Posted by Oshel, Philip Eugene
URL: http://confocal-microscopy-list.275.s1.nabble.com/Web-app-to-direct-user-to-the-right-microscope-tp7592166p7592170.html
*****
To join, leave or search the confocal microscopy listserv, go to:
http://lists.umn.edu/cgi-bin/wa?A0=confocalmicroscopy
Post images on http://www.imgur.com and include the link in your posting.
*****
I'm not sure an app like this is the right approach, at least at first.
The user needs to state explicitly what the question is. The question drives the method. I often find users haven't properly defined their question(s), so any attempt to answer "what kind of microscopy do I need?" is at best premature.
Once the user has a clear statement of their question -- not what data or kind of data they're looking for, but the *question* -- they can then determine the proper controls (yes, plural; only needing one control is rare). Only after both the question(s) and controls are unambiguously determined can "what kind of microscopy?" be addressed. (And the answer may well be "None. Do 'method X' instead".)
An app that forced the user to do this would be very valuable.
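To make that concrete, here is a minimal sketch (Python; the prompts, field names, and thresholds are all made up for illustration) of the forcing function I have in mind: the intake refuses to move on to any "which microscope?" logic until the user has written down the question and at least one control.

# Hypothetical intake sketch: force a stated question and controls
# before any "which microscope?" logic is allowed to run.
from dataclasses import dataclass, field
from typing import List

@dataclass
class ImagingRequest:
    question: str = ""                                   # the biological question, in one sentence
    controls: List[str] = field(default_factory=list)    # planned controls (usually more than one)

    def ready_for_modality_advice(self) -> bool:
        # Arbitrary stand-in checks: a question of a few words and at
        # least one named control before modality options are shown.
        return len(self.question.split()) >= 8 and len(self.controls) >= 1

def intake() -> ImagingRequest:
    req = ImagingRequest()
    req.question = input("State your question (not the data you want): ").strip()
    while True:
        c = input("Add a control (blank line to finish): ").strip()
        if not c:
            break
        req.controls.append(c)
    if not req.ready_for_modality_advice():
        print("Question or controls are not yet defined; talk to your imaging facility first.")
    return req
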
-------------
Philip Oshel
Imaging Facility Director
Biology Department
1304 Biosciences
1455 Calumet Ct.
Central Michigan University
Mt. Pleasant, MI 48859
989 774-3576 office
-----Original Message-----
From: Confocal Microscopy List <[hidden email]> on behalf of Benjamin Smith <[hidden email]>
Reply-To: Confocal Microscopy List <[hidden email]>
Date: Wednesday, 28 April 2021 at 13:44
To: "[hidden email]" <[hidden email]>
Subject: [External] Re: Web app to direct user to the right microscope?
Along the lines of what Craig said, multimodal imaging approaches are often
best, and the answers would also depend on the user's willingness to use
non-traditional methods regardless of how easy they may be to implement
(such as Rheinberg illumination). The user would also need to know how they
intend to quantify their data in order to know how to collect it: the
resolution they need, field of view, colocalization, multi-spectral
requirements, how many dimensions, etc. Many times I have run into people
who collected months' worth of data on the correct instrument, only to find
out the data were collected incorrectly and could not be quantified, and the
entire set had to be reacquired.
For example, if you wanted to measure something as "simple" as
colocalization, then depending on the nature of the question and the
resources available you could do BiFC, fluorescence anisotropy, FRET, FCCS,
super-resolution (Airyscan, STORM, STED, GSD, etc.), CLEM, hyperspectral
microscopy; the list goes on and on. On top of that, for each of these
modalities there are further considerations for which probes to use
(especially for BiFC, as well as FRET and STORM) and how best to set up the
experiment. Then comes the other half of the problem: how to quantify the
colocalization data, and the amount and type of information that can be
gleaned from each of these methods.
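Just to illustrate how quickly that branches, here is a toy lookup (Python; the categories and entries are illustrative only, not a validated decision table) from a stated colocalization goal to candidate modalities from the list above:

# Toy sketch only: a lookup from a stated colocalization goal to candidate
# modalities named above. Categories and entries are illustrative, not a
# validated decision table.
COLOCALIZATION_OPTIONS = {
    "molecular interaction (<10 nm)":            ["BiFC", "FRET", "fluorescence anisotropy"],
    "co-diffusion in live cells":                ["FCCS"],
    "co-occurrence below the diffraction limit": ["STORM", "STED", "GSD", "Airyscan"],
    "ultrastructural context":                   ["CLEM"],
    "heavily overlapping emission spectra":      ["hyperspectral microscopy"],
}

def candidate_modalities(goal: str) -> list:
    # Unknown goals return an empty list so the app can hand the user
    # off to a microscopist instead of guessing.
    return COLOCALIZATION_OPTIONS.get(goal, [])

print(candidate_modalities("co-diffusion in live cells"))   # -> ['FCCS']

A real app would need a table like this for every measurement type, each with its own probe and setup caveats.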
I could also be grossly overcomplicating things, but in my experience users
often don't even know how to phrase what they want to do in terms of what
the microscope is capable of, which is the whole reason skilled
microscopists are so sought after to help bridge that gap. However, if
someone had the insight to condense all of this into an app that a
non-microscopist could easily navigate and understand, it would be
revolutionary for the field and a major contribution to the scientific
endeavor, greatly appreciated by millions! I just hope it wouldn't encourage
even more people to collect months of worthless data because they did not
fully understand what they were trying to do.
On Wed, Apr 28, 2021 at 9:55 AM Craig Brideau <[hidden email]> wrote:
>
> I feel that you would have to ask far too many questions to account for
> even the more common variables in sample type, preparation, etc., before
> you even consider the equipment. I would instead consider a first-pass
> "screening questionnaire" that would be forwarded to an actual microscopy
> professional for follow-up.
>
> Craig
>
> On Wed, Apr 28, 2021 at 10:29 AM VERMEREN Matthieu <[hidden email]> wrote:
>
> >
> > Dear all,
> >
> > I've been repeatedly asked by some users if I could build a web app
> > that would guide them through a series of questions (like a personality
> > test) to the ideal microscope(s) for their experiment. Does anyone have
> > experience with this? Which app have you used?
> >
> > Obviously, I'd rather they come and talk, but maybe I'm just scary :-)
> >
> > Sincerely,
> >
> > Matthieu
> >
> > The University of Edinburgh is a charitable body, registered in Scotland,
> > with registration number SC005336.
> >
>
--
Benjamin E. Smith, Ph. D.
Imaging Specialist, Vision Science
University of California, Berkeley
195 Weill Hall
Berkeley, CA 94720-3200
Tel (510) 642-9712
Fax (510) 643-6791
e-mail: [hidden email]
https://vision.berkeley.edu/faculty/core-grants-nei/core-grant-microscopic-imaging/