I would recommend Vista 64-bit, 8 cores, an OpenGL “workstation grade” video card, and 16GB of RAM. Eventually the software you use will grow into multi-core processing. Ignore the GPU options; they are mainly for 3D graphics and games.
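To give a rough idea of why the extra cores pay off, here is a minimal sketch (purely illustrative; the class name and the dummy filter are made up, not anything from Metamorph or Matlab) of the kind of thing analysis packages do internally when they parallelize: split an image into bands and let a thread pool work on the bands at the same time.

import java.util.ArrayList;
import java.util.List;
import java.util.concurrent.Callable;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;

// Illustrative sketch only: split a grayscale image (held as a float array)
// into horizontal bands and filter each band on its own core.
public class ParallelFilterDemo {
    public static void main(String[] args) throws Exception {
        final int width = 2048, height = 2048;
        final float[] pixels = new float[width * height]; // stand-in for real image data

        int cores = Runtime.getRuntime().availableProcessors();
        ExecutorService pool = Executors.newFixedThreadPool(cores);

        List<Callable<Void>> tasks = new ArrayList<Callable<Void>>();
        int band = height / cores;
        for (int i = 0; i < cores; i++) {
            final int y0 = i * band;
            final int y1 = (i == cores - 1) ? height : y0 + band;
            tasks.add(new Callable<Void>() {
                public Void call() {
                    // dummy per-pixel operation standing in for a real filter
                    for (int y = y0; y < y1; y++)
                        for (int x = 0; x < width; x++)
                            pixels[y * width + x] = (float) Math.sqrt(pixels[y * width + x] + 1f);
                    return null;
                }
            });
        }

        pool.invokeAll(tasks); // blocks until every band is done
        pool.shutdown();
        System.out.println("Processed " + (width * height) + " pixels on " + cores + " cores");
    }
}

Whether Metamorph or Matlab actually spread their work across cores like this depends on the version and toolboxes you run, which is exactly the "grow into it" point above.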
As for video, Nvidia Quadro and ATI FireGL cards with 256MB-512MB of video RAM are the way to go for a new scientific imaging analysis workstation.
We build all our own workstations here with Quadros (mainly FX 1500s and FX 3450s), use RAID-striped arrays of 2-4 disks (500GB to 1TB each), and install 4GB of RAM for XP or 16GB for Vista. We use Intel boards for our single-processor systems (2-4 cores) and Supermicro boards for our dual-processor machines (4-8 cores). We stopped using AMD CPUs about 3 years ago in favor of Intel, and we use 600W to 800W power supplies to feed these systems.
Hope this helps.
Cheers,
Jonathan M. Ekman
Imaging Technology Group
Beckman Institute for Advanced Science and Technology
University of Illinois at Urbana-Champaign
405 N. Mathews Avenue
Urbana, IL 61801 USA
Tel: 217-244-6292
Fax: 217-244-6219
From: Confocal Microscopy
List [mailto:[hidden email]] On Behalf Of Christophe
Leterrier
Sent: Monday, September 22, 2008 3:33 AM
To: [hidden email]
Subject: Advice for offline image analysis computer
Dear listers,
I have to buy a new computer for our team that will be used for offline image analysis. The required software (ImageJ, Metamorph, Matlab) means building a Windows/Vista machine. The cost would be around $5,000 / 5000€. The job would be processing (a lot) and rendering. My questions are:
- Should I go for a "traditional" computer, meaning a 32-bit, dual-core, XP machine with maxed-out RAM (I guess that is 3GB or so)? Or is it useful to go for fancier hardware: 64-bit, 4 or 8 cores, 8 to 16GB of RAM? Would ImageJ/Metamorph/Matlab really benefit from it? (I would love an answer from the Metamorph/Matlab people; a quick check of what the software actually sees is sketched after these questions.)
- What about graphics cards and GPUs? What is the best choice? I've heard about new strategies to speed up processing by making the GPU churn through data as well as graphics, but I don't think that is really commercially available or implemented in commercial software yet.
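For what I mean by "benefit" in the first question: here is a trivial check (just a sketch; the 12 GB figure is only an example) of how much memory a Java program like ImageJ can actually see once launched.

// Tiny check of how much heap the JVM (which runs ImageJ) can actually use.
// On a 32-bit Windows JVM this tops out below roughly 2 GB, however much RAM
// is installed; on a 64-bit JVM it follows the -Xmx you give at launch,
// e.g. something like: java -Xmx12g -jar ij.jar  (12g is just an example value).
public class HeapCheck {
    public static void main(String[] args) {
        long maxBytes = Runtime.getRuntime().maxMemory();
        System.out.printf("Max heap visible to this JVM: %.1f GB%n",
                maxBytes / (1024.0 * 1024 * 1024));
    }
}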
Thanks for your advice!
Christophe