ImageJ2... dead?

offterdi1
ImageJ2... dead?

*****
To join, leave or search the confocal microscopy listserv, go to:
http://lists.umn.edu/cgi-bin/wa?A0=confocalmicroscopy
*****

Dear all,
Does anyone know the current status of the ImageJ2 project?
There used to be regular news on the website, but for several months it
has seemed rather dead: no information, and several beta versions delayed
by many months.
Is the whole project still being continued?
Best

Martin
Kevin W Eliceiri
Re: ImageJ2... dead?


Dear Martin,


Thank you for your interest. The ImageJ2 effort is very much active, and the best place to follow its progress is the ImageJ listserv, where the lead developers from LOCI post regularly. As announced last summer, after the successful completion of several key parts of the ImageJ2 architecture, such as the headless plugin framework, the emphasis shifted to application development that takes advantage of this work. We have also focused on tighter integration with Fiji (www.fiji.sc), using Fiji as the method of choice for ImageJ2 deployment; in fact, much of the ImageJ2 code base is already active and used in the current Fiji. We have several pilot projects with other tools that use ImageJ2/Fiji, including OME, CellProfiler, and KNIME, and there will be several announcements about these in the coming months.

In the meantime, please let us know if you have any specific questions or features of interest. We know it can be confusing to keep track of all the developments, even for readers of the ImageJ listserv, and we are planning to put more effort into updated explanations on the various associated websites (i.e., Fiji, ImageJdev, etc.).


best
kevin


--
Kevin W. Eliceiri
Director, Laboratory for Optical and Computational Instrumentation (LOCI)
Departments of Cell and Molecular Biology and Biomedical Engineering
Affiliate Principal Investigator, Morgridge Institute for Research (MIR)
Room 271 Animal Sciences, 1675 Observatory Drive, Madison, WI 53706
Phone: 608-263-6288
Johannes Schindelin
Re: ImageJ2... dead?

*****
To join, leave or search the confocal microscopy listserv, go to:
http://lists.umn.edu/cgi-bin/wa?A0=confocalmicroscopy
*****

Hi Martin,

On Wed, 26 Mar 2014, Martin Offterdinger wrote:

> I am interested to hear if anyone is aware about the status of the
> ImageJ2 project?

Thank you for your interest!

> There were always some news on the website, but since several months it
> seems to be rather dead there? No information several beta versions
> delayed many months...

There have been some changes in the development team, and that is
naturally followed by a couple of weeks/months of restructuring the
project.

It is not dead, though! ;-)

> Is it still continued the whole project?

We concentrated a little bit more on infrastructure, to make it possible
to maintain this hugely modular project with less effort. We are almost
done with those changes and they already help us move faster, but of
course those changes took time.

We also concentrated on lower-level software architecture. For example,
there was a recent, hugely successful hackathon hosted by KNIME in
Konstanz, resulting in an elegant architecture for designing efficient
algorithms (we have not yet gotten around to finishing up some loose ends
and sending out a summary to the ImageJ mailing list; I am actually
working on this as we speak). This architecture will be used by both KNIME
and ImageJ2.

And we also focused more on use cases, such as running ImageJ 1.x plugins
in non-graphical settings. Just imagine a cloud application wanting to run
multiple ImageJ 1.x macros in parallel -- there are many scientists who
would benefit from such a setup. Due to ImageJ 1.x's implementation
details, this is not possible out of the box: the macros would interfere
with each other. The ImageJ2 project came up with a (partial) solution
called ij1-patcher. It is one example where fundamental work done in the
ImageJ2 project benefits many more projects.
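The headless, parallel setup Johannes describes can be driven from a plain script. A minimal sketch in Python, assuming a hypothetical launcher binary named `ImageJ-linux64` on the PATH (the real name varies by platform) and the `--headless`/`-macro` launcher options that the ij1-patcher work enables:

```python
import subprocess
from concurrent.futures import ThreadPoolExecutor

LAUNCHER = "ImageJ-linux64"  # hypothetical name; platform-dependent in practice

def headless_macro_cmd(macro_path):
    # One command line per macro; each run gets its own JVM process,
    # so the macros cannot interfere with each other.
    return [LAUNCHER, "--headless", "-macro", macro_path]

def run_macros_in_parallel(macro_paths, workers=4):
    # Launch up to `workers` independent headless ImageJ processes at once.
    with ThreadPoolExecutor(max_workers=workers) as pool:
        return list(pool.map(
            lambda m: subprocess.run(headless_macro_cmd(m), check=True),
            macro_paths))
```

Process-level isolation sidesteps the shared-state problem by brute force; the point of ij1-patcher is to make similar isolation possible within a single JVM.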

Let me add a teaser: we are also working on making it possible to run
ImageJ2 plugins in ImageJ 1.x. By necessity, this support will be limited
(ImageJ 1.x can handle only three data types and at most five dimensions,
and is also limited by RAM in the size of the images it can handle), but
it will most likely let users benefit from all the work we have done.

So you see, we have been quite busy. The benefits are of course less
immediately visible to the user than a GUI redesign would be, but I am
convinced that the groundwork we are accomplishing right now is important
and will serve us well for years to come. As a scientist, you will
probably appreciate that such fundamental work may look less sexy but is
crucial for future developments.

Ciao,
Johannes
George McNamara
Re: ImageJ2... GPU and Phi card parallel processing?


Hi Johannes and Kevin,

Bruce and Butte 2013 Optics Express
http://www.opticsinfobase.org/oe/fulltext.cfm?uri=oe-21-4-4766&id=249375
http://tcell.stanford.edu/software.html
"For academic and government users, the software is free." I encourage
Johannes, Kevin, and the ImageJ2 team to contact Manish to get his B&B
code and arrange to port it into ImageJ2, and especially in headless mode.
Public service announcement: see the !!! near the end of my message for
running FFT deconvolution efficiently.

I reminded Kevin of this on Tuesday at ABRF (more on this below). I have
previously posted about this here. See
http://works.bepress.com/gmcnamara/37/ and the links therein for raw and
B&B-processed Stellaris FISH 3D widefield image data, and see
http://stellarisfish.smugmug.com/ for the source of our FISH probe sets.

Zanella R, Zanghirati G, Cavicchioli R, Zanni L, Boccacci P, Bertero M,
Vicidomini G. Towards real-time image deconvolution: application to
confocal and STED microscopy. Sci Rep. 2013;3:2523.
doi:10.1038/srep02523. PubMed PMID: 23982127; PubMed Central PMCID:
PMC3755287.

Piotr Pawliczek posted and published:
http://pawliczek.net.pl/deconvolution/

Parallel deconvolution of large 3D images obtained by confocal laser
scanning microscopy, Piotr Pawliczek, Anna Romanowska-Pawliczek,
Zbigniew Soltys, Microscopy Research and Technique, 2009.
http://www3.interscience.wiley.com/journal/122581680/abstract

I note that MRT is now Alby's journal - maybe he can organize a special
issue on what various groups are doing for deconvolution with respect to
widefield, confocal, STED, etc.

UNC Clarity Deconvolution ... which appears to be dead with respect to
UNC development - maybe someone else can take over Clarity?

http://www.cs.unc.edu/techreports/09-001.pdf
http://cismm.cs.unc.edu/downloads/
- Clarity GPU deconvolution
- listserv members may -- or not -- find ImageSurfer useful
- a bunch of other software is on the download site (not alphabetized or
ordered in any other way)

A relatively new (not GPU- or Phi-accelerated, to my knowledge) "low light
level" deconvolution was published by John Sedat's and David Agard's labs
in 2013:

High-resolution restoration of 3D structures from widefield images with
extreme low signal-to-noise-ratio.
Arigovindan M, Fung JC, Elnatan D, Mennella V, Chan YH, Pollard M,
Branlund E, Sedat JW, Agard DA.
Proc Natl Acad Sci USA 2013;110(43):17344-9. doi: 10.1073/pnas.1315675110.
PMID: 24106307. http://www.ncbi.nlm.nih.gov/pubmed/24106307
and is now available at Agard's lab:
http://msg.ucsf.edu/IVE/Download/index.html

Note: they may not have optimized the settings of the deconvolution
software they compared ER-Decon to. It would be helpful if Arigovindan
et al posted all their raw data files online so that anyone can try out
any algorithm.
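For readers wondering what these deconvolution packages actually compute: the core of most of them is some variant of the Richardson-Lucy update, which the GPU/Phi work above accelerates. A minimal NumPy sketch of my own (assuming a PSF the same shape as the image with its peak at index 0, per the FFT convention; real packages add regularization, edge handling, and acceleration):

```python
import numpy as np

def richardson_lucy(observed, psf, iters=20):
    # Classic Richardson-Lucy deconvolution via FFT-based (circular)
    # convolution. `psf` must have the same shape as `observed`, with its
    # peak at index 0 (i.e. fftshifted), per the usual FFT convention.
    Fpsf = np.fft.fftn(psf)
    est = np.full_like(observed, observed.mean())
    for _ in range(iters):
        blurred = np.real(np.fft.ifftn(np.fft.fftn(est) * Fpsf))
        ratio = observed / np.maximum(blurred, 1e-12)
        # Correlation with the PSF (conjugate in Fourier space)
        # redistributes the residual ratio back onto the estimate.
        est = est * np.real(np.fft.ifftn(np.fft.fftn(ratio) * np.conj(Fpsf)))
    return est
```

Every iteration costs a handful of 3D FFTs, which is why the power-of-2 sizing discussed below matters so much for speed.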


///

With respect to ABRF, Patrick O'Farrell won the 2014 ABRF Award for
Outstanding Contributions to Biomolecular Technologies.
https://conf.abrf.org/abrf-award
Most of Pat's talk was on 2D-PAGE (see PMID 24568796 for a recent write-up).
In the last few minutes Pat showed TALE-FPs lighting up two different
tandem repeats in every cell of Drosophila embryos. This was just
published in:


Illuminating DNA replication during Drosophila development using
TALE-lights.
Yuan K, Shermoen AW, O'Farrell PH.
Curr Biol. 2014 Feb 17;24(4):R144-5. doi: 10.1016/j.cub.2014.01.023.
PMID: 24556431. http://www.ncbi.nlm.nih.gov/pubmed/24556431

References to other TALE-FP and Cas9-FP papers can be found in the
download at
http://works.bepress.com/gmcnamara/42/

Small fluorescent foci -- or, for that matter, not so small ones -- will
benefit from deconvolution. Data sets with two or more colors will benefit
from the joint 3D spatial deconvolution and spectral unmixing (my
terminology) approach of Adam Hoppe et al (who acronymized it as 3DFSR;
his "F" was for 3D-FRET, but the approach also works for multichannel
fluorescence without FRET):

Hoppe AD, Shorte SL, Swanson JA, Heintzmann R. Three-dimensional FRET
reconstruction microscopy for analysis of dynamic molecular interactions
in live cells. Biophys J. 2008 Jul;95(1):400-18.
doi:10.1529/biophysj.107.125385. PubMed PMID: 18339754; PubMed Central
PMCID: PMC2426648.

See especially figures 5 and 6
http://www.ncbi.nlm.nih.gov/pmc/articles/PMC2426648/figure/fig5/
http://www.ncbi.nlm.nih.gov/pmc/articles/PMC2426648/figure/fig6/

See also their 2006 Proceedings,

http://www.pasteur.fr/recherche/unites/Pfid/html/publ/06_3.pdf

In 2008, with their 61 lines of unoptimized MATLAB code
(http://sitemaker.umich.edu/4dimagingcenter/3dfsr),
JSD/SU took a while to run. Adam is actively working on parallelizing it.
With a GPU or a Xeon Phi
http://www.intel.com/content/www/us/en/processors/xeon/xeon-phi-detail.html
(see the 7120 models), or Phis on a mini-cluster, or TACC Stampede
https://www.tacc.utexas.edu/stampede/
or similar systems, this could be very fast (though I think TACC made a
mistake by not putting in enough Phi and GPU cards to go to 11).

One note about Adam's paper and many other 3D deconvolution studies: these
usually use FFTs (and usually FFTW). FFTs are best done on sizes that are
a power of 2, for example 8, 16, 32, 64, 128 ... 2048. Not using a power
of two forces the software to pad the data up to the next power of two,
which makes for both a bigger data set than you expect and the potential
for artifacts. My lab's Hamamatsu FLASH camera has 2048x2048 pixels, and I
like to use them all; this works nicely for FFTs. The other type of sCMOS
sensor, from Andor, Fairchild, and PCO, has a non-power-of-2 pixel count,
so I recommend using a power-of-2 region of interest for deconvolution.
Even more important:

Use a power of 2 for the Z-planes !!!

Hoppe et al used 512x512x30 planes: 2 out of 3 axes is good, but not
good enough. My thanks to Manish Butte for emphasizing:
1. Use powers of 2 for all three axes.
2. The NVIDIA TITAN card has 6 GB of RAM and is the card of choice (at
least until new models come out), instead of the current 3 GB NVIDIA
GeForce GTX 780 and 780 Ti cards, as well as the slightly faster TITAN
Black. They did just announce that the 780 is going to 6 GB soon:
http://www.tomshardware.com/news/evga-gtx-780-6gb-nvidia,26377.html
and see also
http://www.anandtech.com/show/7897/nvidia-announces-geforce-gtx-titan-z-dualgpu-gk110-for-3000
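Manish's rule of powers of 2 on all three axes is easy to enforce in software. A small sketch in Python/NumPy (the function names are mine, for illustration):

```python
import numpy as np

def next_pow2(n):
    # Smallest power of two >= n, e.g. 30 -> 32, 2048 -> 2048.
    return 1 << (n - 1).bit_length()

def pad_to_pow2(stack):
    # Zero-pad a 3D stack so every axis is a power of two, so the FFT
    # library does not have to pad implicitly (a source of bigger data
    # sets than you expect, and of artifacts).
    target = tuple(next_pow2(s) for s in stack.shape)
    padded = np.zeros(target, dtype=stack.dtype)
    padded[tuple(slice(0, s) for s in stack.shape)] = stack
    return padded
```

For example, a 512x512x30 acquisition like Hoppe et al's pads to 512x512x32; acquiring 32 Z-planes in the first place avoids even that.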


Hopefully the microscope vendors will start offering "instant
gratification" quantitative -- and multichannel (JSD/SU) -- deconvolution
soon, since none does now. The "DeltaVision" deconvolution mode on the OMX
comes close, except that the user still needs to 'drag and drop' files (at
least by default), and an OMX costs a lot of money if used just for
deconvolution.

George





--



George McNamara, Ph.D.
Single Cells Analyst
L.J.N. Cooper Lab
University of Texas M.D. Anderson Cancer Center
Houston, TX 77054
Tattletales http://works.bepress.com/gmcnamara/26/