Posted by Mel Symeonides on
URL: http://confocal-microscopy-list.275.s1.nabble.com/PC-requirements-tp7587774p7587781.html
*****
To join, leave or search the confocal microscopy listserv, go to:
http://lists.umn.edu/cgi-bin/wa?A0=confocalmicroscopy
Post images on http://www.imgur.com and include the link in your posting.
*****
Hi Johannes,
Thanks for your input, I appreciate it and have some comments. While
mainframe computers/clusters are in principle much faster than any
reasonable custom-built PC, as your experience at Heidelberg shows, it
can be a challenge to actually run your analysis pipeline on one, and
you will need someone on staff who can code to get it done. If you
don't have that expertise in your immediate group, and if you are at a
smaller institute where such support is unavailable or stretched very
thin, a custom PC can get the job done in very reasonable time with
readily available, optimized programs like ImageJ or commercial turnkey
packages (for example, it took me about 3.5 hours to deskew a light
sheet stage scan dataset of 500 planes x 2000 timepoints x 2 colors at
512x512 resolution).
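For a sense of scale, here is a back-of-envelope calculation of the raw size of a dataset with those dimensions. The 16-bit pixel depth is my assumption (typical for sCMOS output, but not stated above):

```python
# Rough raw size of the light sheet dataset described above:
# 500 planes x 2000 timepoints x 2 colors at 512x512,
# assuming 16-bit (2-byte) pixels -- an assumption, not a stated fact.
planes, timepoints, colors = 500, 2000, 2
width, height = 512, 512
bytes_per_pixel = 2  # 16-bit

total_bytes = width * height * planes * timepoints * colors * bytes_per_pixel
print(f"{total_bytes / 1e12:.2f} TB")  # prints "1.05 TB"
```

Which is to say, a single acquisition like this lands right around the 1 TB figure Peter asked about.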
I don't think $9,000 is a prohibitive cost for a computer when the
instrument that generates the data can cost 20x - 100x more than that.
The Hamamatsu sCMOS camera we bought cost more than the workstation and
data server combined. Besides all that, as you mentioned, data transfer
speed can be a big issue, and fast networking is not cheap at all. I'm
sure many institutes do not have the network infrastructure to transfer
1 TB datasets within any reasonable timeframe (in an older building,
you are lucky if you even have Cat5 wiring), so farming out processing
of such datasets to a mainframe may end up being much slower than doing
it locally once you include the data transfer time, and that is IF they
even give you enough server disk space for a whole dataset. Rewiring a
building for faster networking can be a very costly project for a
university, i.e. one that will likely never be done (the typical answer
is: "What's the problem? We installed 802.11n WiFi for you.")
Lastly, a powerful computer like that is useful not just for processing
raw data (which a mainframe could in theory do faster) but also for
preparing data for presentation, e.g. composing flythrough 3D movies,
annotation, etc., interactive tasks which cannot be farmed out. Such
functions require a powerful workstation PC and the fastest possible
access to your dataset (i.e. local storage).
Best,
Mel
On 1/15/2018 12:23 PM, P. Johannes Helm wrote:
>
> Dear Peter and dear list,
>
> might it be an alternative to have a PC with large graphics memory
> only, and use this PC solely for display purposes? The image
> analysis calculations could well be done on a large mainframe computer
> (parallel, vector, or both), which is usually provided and maintained
> by specialized staff in a university's or research center's central
> IT unit. Often, there will also be qualified staff to implement
> suitable image analysis software. In addition, one would need a fast
> data line to transfer the original data to a suitable disk on the
> mainframe computer, but all of this is cheaper than a "super PC".
>
> Besides being considerably cheaper than putting a lot of money into a
> decentralized unit, which will still be slower than a large mainframe
> computer, one can avoid having to deal with any expensive damage,
> which sooner or later will occur on the PC. Also, the IT unit will
> take care of software, hardware and firmware updates on the mainframe
> computer.
>
> The disadvantages might be a somewhat reduced flexibility and that
> someone in the lab would have to learn at least some UNIX.
>
> I had, during the late 80s and early 90s, been in a lab in Heidelberg,
> FRG, where one of the old Sarastro Phoibos IV systems had been
> installed and in use. It was one of the earliest CLSMs, and it was a
> very fine one. The software was run on a Personal Iris by Silicon
> Graphics. Once the scanned data volumes became larger and larger, we
> asked for and got access to the source codes for the 3D reconstruction
> software, and a master student vectorized the source code, which then
> was installed and run on a Convex C210, then one of the fastest
> mainframe computer systems available. When initial problems with
> process swapping had been resolved, the speed of data processing
> increased roughly "by an order of magnitude", i.e. processing large
> data sets consumed about 1/10 of the time that had been required
> before on the small frame Personal Iris.
>
> Best wishes,
>
> Johannes
>
>
>
> On 2018-01-15 13:22, Owens, Peter wrote:
>>
>> Dear listers,
>>
>> I am looking into purchasing a high-end image processing PC that
>> will be capable of processing large multidimensional data sets up to
>> 1 TB in size.
>> Does anyone have any recommendations on a PC configuration that would
>> be suitable?
>> Do people build custom PCs or buy off the shelf?
>> Are high spec gaming PCs up to this task?
>>
>> Thanks for any advice on this.
>>
>> all the best
>>
>> Peter
>>
>>
>> Peter Owens
>> Centre for Microscopy and Imaging,
>> National University of Ireland Galway.
>> P: +35391494036 m: +353863326749
>> W: www.imaging.nuigalway.ie e: [hidden email]
>>
>>
>>
>