SPIDER can now load and create MRC format image and volume files. Just specify the full file name including the extension, e.g. file.mrc or file.mrcs, anywhere that SPIDER requests a filename. For MRC stacked images you can use: 1@file.mrcs. For prompts that request a template use: *@file.mrc. Further information and notes on the drawbacks of using MRC stacks are given here.
It is now obvious that most of the advance in resolution of cryo-EM reconstructions since 2013 has come from the use of direct electron capture cameras and not from improvements in reconstruction software. The report from the 'EM Databank Map Challenge', in which I participated, supports this claim.
However, software packages such as Cryosparc, Relion, EMAN2, SPARX, Xmipp, Bsoft, and SIMPLE offer increased speed, convenient interfaces, and often database connectivity improvements not available in SPIDER. Due to SPIDER's age and design differences, these types of improvements are often impossible to add to it.
SPIDER continues to be used for complete single particle reconstructions, from microscope frame alignment through refinement. It can yield reconstructions close in resolution to those from the above software even though it does not use 'maximum likelihood' methodology and some other advances. However, it is far slower than GPU-accelerated Cryosparc or Relion. It also has a command line interface, which some younger users find old-fashioned.
SPIDER contains software for reconstruction from tilt pair imagery, particle picking, classification capabilities, and a wide selection of general image processing operations which are not available in these other packages. These methods are valuable for tasks like creating particle masks and understanding particle heterogeneity. They are also valuable in teaching environments.
What is the future role of SPIDER? It is already obvious that funding is not available for upgrades and improvements. Does anyone out there care? Where do we go from here? Do you see any continued use of SPIDER in your laboratory?
The feasibility of using GPUs and CUDA for speeding up reconstruction has changed significantly since I last described our efforts. Recent generations of GPUs from Nvidia and AMD load data much faster, and much larger memory sizes are available. It is no longer necessary to create a suite of approaches to parallelize different data set sizes and shapes. Implementations in Relion, and even more so in Cryosparc (which uses a different approach to alignment search), illustrate the speed-up that is possible. We should revisit GPU use within SPIDER, but I won't.
Both Science and C&E News have acknowledged the current 'revolutionary advance' in cryo-electron microscopy single-particle reconstruction.
These advances in resolution rely on the new direct electron capture cameras, and the publications that I have seen use Relion software for the reconstruction.
I am still uncertain how much of the improved resolution arises from improved software. At issue is not only the reconstruction methodology but also the resolution metric.
If Relion is a significant source of the improvement, a question arises about the future role of other software packages in reconstruction. Currently Relion is able to handle most of the reconstruction pathway except for particle selection (windowing) and initial reference model construction.
These other packages include SPIDER, EMAN2, SPARX, Xmipp, IMAGIC, Bsoft, SIMPLE, and some others. They still contain capabilities not found in Relion, e.g. particle selection and initial reference model construction.
Apart from these capabilities, what is the future function of these packages? Will they survive Relion's ascent? How much future development should be done on them? What will be the impact on funding for software other than Relion?
EM software development funding by NIH in the US is currently in a rather bad state. Both SPIDER and IMOD (with its associated software) have lost major portions or all of their funding. At NIH almost all software development grants, for widely different purposes, compete directly with one another and also with funding for various biological databases. This lack of targeting leads to poor quality reviewing.
For example, in the case of SPIDER, one of the three reviewers of our most recent grant application stated:
"the number of investigators employing SPR is limited and not expected to grow substantially". It is difficult for me to see how a knowledgeable reviewer could come to such a conclusion in the midst of a 'revolutionary advance'.
There does not appear to be any viable non-grant mechanism for the continued maintenance of scientific "Free Open Source Software". Is it reasonable to hope that researchers will direct voluntary monetary donations to software developers, as some have suggested? Can researchers even get such a contribution approved by their local grant administrators? Would their auditors approve such an unobligated contribution? There are additional problems with currency conversion. Certainly the red tape involved in both donating and accepting a donation conspires against this idea. Up until now most software development has existed as a sort of side-operation of previously fairly well funded EM labs, in our case an 'NIH research resource'. Such funding is increasingly at risk, and long-term development and maintenance of software is disappearing.
This uncertainty in funding confounds discussion of the future of EM software. Where do we go from here? Do you see continued use of SPIDER and other packages?
29 Nov. 2012 ArDean Leith
A single particle reconstruction from cryo-EM images of non-symmetrical objects often requires 100,000 to 1,000,000 images. If such a large number of images is stored as individual files in most common Linux file systems, accessing or adding images will cause thrashing of the file system and extremely slow access. This occurs not just in the processes accessing the images but throughout all access to that file system.
To overcome this thrashing one can purchase an expensive parallel file storage system (e.g. from Panasas) or, more commonly, aggregate the images into 'stacks', or, less commonly, into a database. Most EM packages support some sort of file based stack. Several different EM single particle reconstruction packages support both MRC and SPIDER format files to various extents.
The MRC stack file format is an especially poor choice for your stacks. There is a single 1024 byte header for the whole stack; individual images are then concatenated into the stack without any image-specific header.
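To make the layout concrete, here is a minimal Python sketch (an illustration only, not part of SPIDER; it assumes mode 2, i.e. 32-bit float pixels) that locates image i inside an MRC stack:

    import struct

    def mrc_stack_image_offset(path, i):
        """Byte offset of 0-based image i in an MRC stack.
        Sketch only: assumes mode 2 (32-bit float) pixels."""
        with open(path, "rb") as f:
            header = f.read(1024)                       # one header for the entire stack
        nx, ny = struct.unpack("<2i", header[0:8])      # image dimensions
        nsymbt = struct.unpack("<i", header[92:96])[0]  # extended header size, if any
        # Images are simply concatenated after the header(s); there is
        # no per-image header in which to store alignment parameters,
        # defocus, or other image-specific metadata.
        return 1024 + nsymbt + i * nx * ny * 4

Because every image is located by arithmetic on a single shared header, any per-image metadata must be kept in separate files.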
4 Sept. 2012 ArDean Leith
We recently introduced improved interpolation using FBS (Fourier-based spline interpolation, described in the entry below) inside several SPIDER operations. We have shown that FBS gives significant improvements over the linear and quadratic interpolation previously used in SPIDER and is as good as the much slower gridded interpolation available in SPARX.
During refinement of a reference based reconstruction, interpolation is used at four steps: creation of reference images from an existing reference volume, application of existing alignment parameters to the experimental images, conversion of image rings to polar coordinates, and alignment of images prior to back projection into a volume.
When we modified our recommended refinement procedure, recon-loop.spi, to use the FBS interpolation alternatives in SPIDER and tested the refinement step on actual cryo-EM data, we were perplexed to find a small but repeatable decline in reconstruction resolution over an overall refinement step.
We investigated this decline using a ribosome data set consisting of four sets of noisy experimental images, taken at different defocus levels, containing over 6000 images in all. The decrease in resolution is caused by the application of the existing alignment rotations and translations to the experimental images before these images are compared with the reference projections to determine the best matching pairs. The 'RT SQ' operation uses quadratic interpolation, which adds an asymmetric filter effect to the results. This filtration ended up cutting noise in the aligned experimental images, so that they gave a better choice of matching reference images. Poorer interpolation gave a better outcome! But this observation pointed to a method of improving the refinement step. We have added an option to denoise the experimental images prior to the reference comparison in the 'AP SHC' operation. We evaluated Fourier lowpass filtering, averaged box convolution, median box convolution, mean shift denoising, and anisotropic diffusion denoising before settling on the Fourier lowpass filter as giving the best resolution results.
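As an illustration of the kind of filter involved, here is a minimal NumPy sketch of a Fourier lowpass with a Gaussian falloff (the Gaussian form and the cutoff parameter are illustrative assumptions, not the settings used by 'AP SHC'):

    import numpy as np

    def fourier_lowpass(img, cutoff):
        """Gaussian lowpass applied in Fourier space.
        img    : 2D image array
        cutoff : cutoff radius as a fraction of Nyquist (0 < cutoff <= 1)"""
        ny, nx = img.shape
        fy = np.fft.fftfreq(ny)[:, None]          # cycles/pixel; Nyquist = 0.5
        fx = np.fft.fftfreq(nx)[None, :]
        r = np.sqrt(fx**2 + fy**2) / 0.5          # radius as a fraction of Nyquist
        mask = np.exp(-(r / cutoff) ** 2)         # smooth falloff avoids ringing
        return np.fft.ifft2(np.fft.fft2(img) * mask).real

A smooth falloff like this avoids the ringing that a sharp cutoff would introduce into the images being matched.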
We have modified our recommended refinement procedure to use FBS interpolation in 'PJ 3F' for the creation of the reference projections, in 'AP SHC' during application of existing alignment parameters to the experimental images, and in 'RT SF' for creating the view used for backprojection. We also used FBS interpolation during conversion of image rings to polar coordinates. These improvements, which are present in recon-loop.spi, gave a significant improvement in resolution over the course of a complete refinement series compared to our previous procedure.
29 Aug. 2012 ArDean Leith
We have developed a 2D and 3D Fourier-based Spline interpolation algorithm (FBS) in order to improve the performance of rescaling, rotation, and conversion from Cartesian to polar coordinates. In order to interpolate a two- or three-dimensional grid we use a particular sequential combination of, correspondingly, two or three 1D cubic interpolations with Fourier-derived coefficients. A 1D cubic interpolation is a third degree polynomial:
Y(X) = A0 + A1*X + A2*X^2 + A3*X^3
where the polynomial coefficients A0, A1, A2, and A3 are calculated from the Fourier transform of the image:
A0 = Y(0)
A1 = Y'(0)
A2 = 3(Y(1) - Y(0)) - 2Y'(0) - Y'(1)
A3 = 2(Y(0) - Y(1)) + Y'(0) + Y'(1)
The derivatives at the grid nodes are obtained using the well-known relation between the Fourier transform of a derivative and the Fourier transform itself:
F(∂f(x,y)/∂x) = i*2*pi*k*F(k,l)
where F(k,l) is a coefficient of the discrete Fourier transform series F(f(x,y)).
This allows us to calculate the derivative at any point without a finite difference approximation involving data from neighboring points.
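To make the recipe concrete, here is a minimal 1D NumPy sketch (an illustration only; it assumes periodic boundaries and unit grid spacing; the 2D and 3D versions chain this 1D building block as described above):

    import numpy as np

    def fbs_interp_1d(y, x_new):
        """1D Fourier-based spline (FBS) interpolation, sketch only.
        y     : real samples on an integer grid, treated as periodic
        x_new : array of fractional positions to interpolate at"""
        y = np.asarray(y, dtype=float)
        n = len(y)
        # Derivatives at every grid node from the Fourier transform:
        # coefficient k is multiplied by i*2*pi*k/n (grid-unit scaling
        # of the relation F(df/dx) = i*2*pi*k*F(k) quoted above).
        k = np.fft.fftfreq(n, d=1.0 / n)              # integer frequencies
        dy = np.fft.ifft(1j * 2 * np.pi * k / n * np.fft.fft(y)).real

        x0 = np.floor(x_new).astype(int) % n          # left grid node
        x1 = (x0 + 1) % n                             # right neighbor (periodic)
        t = x_new - np.floor(x_new)                   # fractional offset in [0,1)

        # Cubic coefficients from the node values and Fourier-derived slopes
        a0 = y[x0]
        a1 = dy[x0]
        a2 = 3 * (y[x1] - y[x0]) - 2 * dy[x0] - dy[x1]
        a3 = 2 * (y[x0] - y[x1]) + dy[x0] + dy[x1]
        return a0 + a1 * t + a2 * t**2 + a3 * t**3

At the grid nodes the spline reproduces the input values exactly, and after the single FFT each interpolated point costs only a short polynomial evaluation, which is why FBS is only modestly slower than quadratic interpolation.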
We compared FBS to two other commonly used interpolation techniques: quadratic interpolation and convolution-based reverse gridding (RG). A rotation of images by FBS interpolation takes roughly 1.1-1.5 times as long as quadratic interpolation, but achieves dramatically better accuracy. The accuracy of FBS interpolation is similar to that of RG interpolation; however, FBS rotation is approximately 1.4-1.8 times faster than RG. The FBS algorithm combines the simplicity of polynomial interpolation with the ability to preserve high spatial frequencies. It has been incorporated into several operations in the open source package SPIDER for single-particle reconstruction.
9 Mar. 2011 ArDean Leith
Since CPU hardware speeds are stagnant or decreasing, there is increased interest in optimizing SPIDER's processing speed. Since SPIDER is a general purpose EM imaging package, this means different things to different users. Locally, the biggest time demand in our single particle reconstructions is alignment of images with reference projections (SPIDER operations 'AP SH' and 'AP REF'). In order to assess the effect of changes in compiler options I used the operation 'AP SHC', which is the latest highly 'tweaked' version of 'AP SH'. The test data was a set of 375x375 pixel images, comparing 50 experimental images against 550 references.
Page updated: 20 Jan. 2020 ArDean Leith