
2015/2016 Map Challenge

Goals

 

  • Establish a benchmark set of single particle raw image datasets suitable for high resolution cryoEM analysis, useful to both software developers and beginners
  • Encourage developers of 3DEM software packages and biological end users to analyze these datasets and present their results following best practices
  • Evolve criteria for evaluating and validating the reconstruction and analysis results
  • Compare and contrast the various reconstruction approaches in a positive spirit, with the aim of achieving high efficiency and accuracy

   

Map Committee

Bridget Carragher (Chair), Jose-Maria Carazo, Wen Jiang, John Rubinstein, Peter Rosenthal, Fei Sun, Janet Vonck, Wah Chiu, Cathy Lawson, Ardan Patwardhan

How to Participate

All members of the scientific community, at all levels of experience, are invited to participate as Challengers and/or as Assessors.

In the Challenge Phase (July 2015-Apr 2016), 27 participants created and submitted 66 reconstructions of the challenge targets using the supplied raw image data. Challengers were encouraged to perform their own movie frame alignment, frame summation, and particle picking. Alternatively, they could begin with pre-aligned, summed images and/or the original author-provided particle positions. For each submission, challengers filled out a questionnaire (overview) and provided the following data:

  • final unmasked map with filtering/sharpening
  • final unmasked map without filtering or sharpening
  • half-maps and mask used for FSC calculation
  • CTF parameters, coordinates, and Euler angles for each particle image used in the reconstruction

In the Blinded Assessment Phase (Nov 2016-Apr 2017), six groups contributed reports. Following an initial review period by the map committee, the challenge data and files are now publicly available (with entry authorship and software suppressed) for anyone to assess. Assessment methods could include statistical analyses, resolution estimation, or coordinate model fitting. A few suggestions gathered from software developers are summarized below. The intention is to enable comparison of the various available packages and their options in a positive spirit. During this period, assessors are strongly encouraged to share their plans and short result summaries on this website using the Assessment Registration Form. Assessment results will be presented and discussed more fully at a workshop (early October 2017) as well as in manuscript submissions to a journal special issue.

Map Challenge Assessment Guidance

We've gathered some suggestions here about how to proceed with comparisons of the map submissions.* These are not meant to be prescriptive; results from other approaches are also welcome.

FSC Analysis

FSC curves based on the provided half-maps and masks have been prepared for each map challenge entry (link). In most cases the results are consistent with, or very close to, the submitter-reported resolution, but this initial analysis cannot be used to compare submissions directly because of differences in masking and map sizes, and thus convolution effects. FSC is a fundamental similarity metric, but its use in standard cryoEM practice has been problematic because of how the compared maps are prepared and masked. Many suggestions were made on how to carry out follow-up analyses (a minimal masked-FSC sketch follows the list below):

  • Apply a single, common mask to all entries belonging to a target (e.g. an average of all entries filtered to 15-20 Å, with soft edges or low-pass filtering applied).
  • Employ other methods/techniques, such as:
    • post-processing phase randomization (e.g. to investigate the effects of different masking on the FSC)
    • mask artifact compensation
    • estimation of the FSC error
    • map-model FSC calculations
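
As a concrete starting point for the common-mask suggestion above, a minimal sketch is shown below. It assumes Python with numpy and the mrcfile package; the file names, mask radius, and soft-edge width are placeholders, and the simple soft spherical mask merely stands in for the filtered average-of-entries mask described in the first bullet. It is not the assessment pipeline used by the committee.

```python
# Minimal sketch: masked FSC between two half-maps, assuming cubic maps of equal size.
import numpy as np
import mrcfile  # assumed available for reading MRC-format maps

def soft_spherical_mask(shape, radius_vox, soft_edge_vox=5):
    """Sphere of ones with a cosine-tapered edge, centered in the box."""
    center = (np.array(shape) - 1) / 2.0
    grid = np.indices(shape) - center[:, None, None, None]
    r = np.sqrt((grid ** 2).sum(axis=0))
    mask = np.zeros(shape, dtype=np.float32)
    mask[r <= radius_vox] = 1.0
    edge = (r > radius_vox) & (r <= radius_vox + soft_edge_vox)
    mask[edge] = 0.5 * (1.0 + np.cos(np.pi * (r[edge] - radius_vox) / soft_edge_vox))
    return mask

def fsc_curve(vol1, vol2, voxel_size):
    """Fourier shell correlation; returns (spatial frequency in 1/Å, FSC) per shell."""
    f1 = np.fft.fftshift(np.fft.fftn(vol1))
    f2 = np.fft.fftshift(np.fft.fftn(vol2))
    n = vol1.shape[0]
    grid = np.indices(vol1.shape) - n // 2
    shells = np.rint(np.sqrt((grid ** 2).sum(axis=0))).astype(int).ravel()
    nshells = n // 2
    cross = np.bincount(shells, weights=(f1 * np.conj(f2)).real.ravel())[:nshells]
    pow1 = np.bincount(shells, weights=(np.abs(f1) ** 2).ravel())[:nshells]
    pow2 = np.bincount(shells, weights=(np.abs(f2) ** 2).ravel())[:nshells]
    fsc = cross / np.sqrt(pow1 * pow2)
    freq = np.arange(nshells) / (n * voxel_size)
    return freq, fsc

# Placeholder file names; the same mask would be applied to every entry for a target.
with mrcfile.open("half_map_1.mrc") as m1, mrcfile.open("half_map_2.mrc") as m2:
    v1, v2 = m1.data.astype(np.float32), m2.data.astype(np.float32)
    apix = float(m1.voxel_size.x)
mask = soft_spherical_mask(v1.shape, radius_vox=v1.shape[0] // 3)  # placeholder radius
freq, fsc = fsc_curve(v1 * mask, v2 * mask, apix)
for f, c in zip(freq, fsc):
    print(f"{f:.4f} 1/Å   FSC = {c:.3f}")
```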

Map Density Analysis

Images of each map, both on its own and aligned to a common model, are provided for reference (link), but further investigation is warranted, as variations in density appearance may be due to differences in power spectra and/or filtering/sharpening schemes. Some suggestions:


  • Both overall images and close-up views are desirable; for comparison it is best to use exactly the same view

  • Both well-ordered and less well ordered regions should be investigated

  • Views containing slices (slabs or grey-scale planes) could be useful

  • Apply a common filtering/sharpening scheme to the unfiltered (raw) map entries for a target, bringing their power spectra to a “common denominator” for density comparison

  • Along the same lines, view density across maps attenuated to a common low resolution, and then walk the attenuation towards higher resolution (see the low-pass filtering sketch after this list)

  • Density quality could be investigated by fitting defined portions of each map using modeling tools (e.g. comparing RMSDs of multiple models).
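
The sketch below illustrates the common-attenuation idea from the last two bullets: the same soft low-pass filter is applied to an unfiltered entry at a series of resolutions, so that densities from different maps can be compared at matched nominal resolution. It assumes Python with numpy and the mrcfile package; the file name, cutoff values, and Gaussian edge width are illustrative choices, not prescribed parameters.

```python
# Minimal sketch: impose a common low-pass cutoff on an unfiltered (raw) map entry.
# Assumes cubic maps read with the mrcfile package; all values are illustrative.
import numpy as np
import mrcfile

def lowpass(volume, voxel_size, resolution, edge_width=0.02):
    """Soft (Gaussian-edged) low-pass filter to the given resolution in Å."""
    n = volume.shape[0]
    freq = np.fft.fftfreq(n, d=voxel_size)               # 1/Å along each axis
    fx, fy, fz = np.meshgrid(freq, freq, freq, indexing="ij")
    s = np.sqrt(fx ** 2 + fy ** 2 + fz ** 2)             # spatial frequency magnitude
    cutoff = 1.0 / resolution
    # Gaussian fall-off beyond the cutoff instead of a sharp edge, to limit ringing
    filt = np.where(s <= cutoff, 1.0, np.exp(-((s - cutoff) ** 2) / (2 * edge_width ** 2)))
    return np.real(np.fft.ifftn(np.fft.fftn(volume) * filt))

# Walk a common attenuation from low towards higher resolution (values illustrative)
with mrcfile.open("entry_raw.mrc") as mrc:                # placeholder file name
    vol = mrc.data.astype(np.float32)
    apix = float(mrc.voxel_size.x)                        # Å/pixel from the header
for res in (8.0, 6.0, 4.0, 3.5):
    filtered = lowpass(vol, apix, res)
    with mrcfile.new(f"entry_lowpass_{res:.1f}A.mrc", overwrite=True) as out:
        out.set_data(filtered.astype(np.float32))
        out.voxel_size = apix
```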


------------

*suggestion credits: Maya Holmdahl, Roberto Marabini, Sjors Scheres, Bernard Heymann, Niko Grigorieff, Pawel Penczek, Ed Egelman, Steve Ludtke, Scott Stagg, Marin van Heel

Timeline

 

JAN-JUN 2015: DEVELOPMENT PHASE
  • Jan-May: Map Committee meets monthly to identify challenge targets, goals, and parameters
  • Mar-May: Requests to 3DEM community members for public deposition of raw image datasets; website development
  • Jul 1: Raw image data for all targets available for download at EMPIAR

JUL 2015-APR 2016: CHALLENGE PHASE
  • Jul: Pre-challenge announcement; Challenger and Assessor registration opens
  • Aug 1: Map submission opens
  • Nov 9-20: Registered participants may apply for SDSC Gordon supercomputer usage
  • Apr 1-2: Map Challenge Committee satellite discussion at the International CryoEM Image Analysis Symposium (Lake Tahoe, CA)
  • Apr 15: Map submission closes at 21:00 UTC

2016-2017: ASSESSMENT PHASE
  • May-Aug 2016: Initial assessments of challenge data, metadata extraction, and preparation for release (Map Committee)
  • Sep-Oct 2016: Review period (Map Committee)
  • 4 Nov 2016-30 Apr 2017: Assessors invited to perform analyses and comment on the released (blinded) data
  • 6 Jun 2017: Map submission data unblinded
  • Jun-Sep 2017: Analysis of assessments with full metadata
  • 6-8 Oct 2017: Two-day workshop for all challenge participants (Committee, Challengers, Assessors)
  • Post-workshop: Challenge write-ups (multiple articles) for the journal special issue

Challenge Targets

Six challenge targets are based on recently described 3DEM single particle structure determinations with data collected as multiple-frames-per-second movies using the latest generation of detectors. One additional target is based on simulated (in silico) images. For each experimental target, the original raw micrograph movie frames are available for download at EMPIAR, PDBe's raw 3DEM image data archive. Summed image data are also available, either as full micrographs or as picked particle stacks, and in one case aligned frames are also deposited. Particle positions and defocus values from the raw data depositors are also available for download and may optionally be used by challengers in their reconstructions.
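
For reference, here is a minimal sketch for browsing one of the EMPIAR entries listed below over anonymous FTP. The host name and directory layout are assumptions about the public EMPIAR archive and should be checked against the entry pages, which list the authoritative download options; EMPIAR-10025 is used purely as an example accession.

```python
# Minimal sketch: browse one Map Challenge EMPIAR entry over anonymous FTP.
# Host name and directory layout are assumptions about the public EMPIAR archive.
from ftplib import FTP

accession = "10025"                                     # example accession only
with FTP("ftp.ebi.ac.uk") as ftp:                       # assumed EMPIAR FTP host
    ftp.login()                                         # anonymous login
    ftp.cwd(f"/empiar/world_availability/{accession}")  # assumed directory layout
    for name in ftp.nlst():                             # list the entry's top level
        print(name)
    # To fetch a file:
    #   with open(name, "wb") as fh:
    #       ftp.retrbinary("RETR " + name, fh.write)
```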

Target 1. GroEL in silico
  • Reference EMDB map entry: --
  • Primary citation: Vulovic et al.
  • Reported resolution (Å): ~3
  • Reference model (PDB entry): 4hel
  • Benchmark storage size: 2 GB
  • EMPIAR ID(s): EMPIAR-10029 (data can also be downloaded from the Chinese Academy of Sciences)
  • Imposed symmetry: Dihedral (D7) / None (C1) (see FAQ)
  • Sample MW (MDa): 0.8
  • Unique MW (kDa): 57
  • Microscope: --
  • Voltage (kV): 300
  • Cs (mm): 2.7
  • Detector: Falcon I
  • Frame sampling (Å/pixel): 1.42
  • Total dose (e-/Å²): 50
  • Dose per frame (e-/Å²): --
  • Frame rate (frames/s): --
  • Frame alignment method: --
  • Particle selection method: --
  • Number of particles: 10000
  • Particle/map sampling (Å/pixel): 1.42
  • Raw data contributors: Yuchen Deng, Fei Sun

Target 2. T20S Proteasome
  • Reference EMDB map entry: EMD-6287
  • Primary citation: Campbell et al.
  • Reported resolution (Å): 2.8
  • Reference model (PDB entry): 1yar
  • Benchmark storage size: 2000 GB
  • EMPIAR ID(s): EMPIAR-10025
  • Imposed symmetry: Dihedral (D7)
  • Sample MW (MDa): 0.7
  • Unique MW (kDa): 50
  • Microscope: Titan Krios
  • Voltage (kV): 300
  • Cs (mm): 2.7
  • Detector: K2
  • Frame sampling (Å/pixel): 0.6575
  • Total dose (e-/Å²): 53
  • Dose per frame (e-/Å²): 1.4
  • Frame rate (frames/s): 5
  • Frame alignment method: UCSF
  • Particle selection method: Appion-FindEM
  • Number of particles: 49954
  • Particle/map sampling (Å/pixel): 0.98
  • Raw data contributors: Melody Campbell, Bridget Carragher

Target 3. Apo-Ferritin
  • Reference EMDB map entry: EMD-2788
  • Primary citation: Russo & Passmore
  • Reported resolution (Å): 4.7
  • Reference model (PDB entry): 4v1w
  • Benchmark storage size: 181 GB
  • EMPIAR ID(s): EMPIAR-10026
  • Imposed symmetry: Octahedral (O)
  • Sample MW (MDa): 0.44
  • Unique MW (kDa): 20
  • Microscope: Polara 300
  • Voltage (kV): 300
  • Cs (mm): 2.7 (see FAQ)
  • Detector: Falcon II
  • Frame sampling (Å/pixel): 1.346
  • Total dose (e-/Å²): 16
  • Dose per frame (e-/Å²): 0.95
  • Frame rate (frames/s): 17
  • Frame alignment method: not performed
  • Particle selection method: EMAN2
  • Number of particles: 483
  • Particle/map sampling (Å/pixel): 1.346
  • Raw data contributors: Chris Russo, Lori Passmore

Target 4. TRPV1 Channel
  • Reference EMDB map entry: EMD-5778
  • Primary citation: Liao et al.
  • Reported resolution (Å): 3.3
  • Reference model (PDB entry): 3j5p
  • Benchmark storage size: 6300 GB
  • EMPIAR ID(s): EMPIAR-10005
  • Imposed symmetry: Cyclic (C4)
  • Sample MW (MDa): 0.3
  • Unique MW (kDa): 80
  • Microscope: Polara 300
  • Voltage (kV): 300
  • Cs (mm): 2.0
  • Detector: K2
  • Frame sampling (Å/pixel): 1.22 (see FAQ)
  • Total dose (e-/Å²): 41
  • Dose per frame (e-/Å²): 1.37
  • Frame rate (frames/s): 5
  • Frame alignment method: UCSF
  • Particle selection method: SamViewer
  • Number of particles: 35645
  • Particle/map sampling (Å/pixel): 1.22
  • Raw data contributors: Jean-Paul Armache, Maofu Liao, Yifan Cheng

Target 5. 80S Ribosome
  • Reference EMDB map entry: EMD-2660
  • Primary citation: Wong et al.
  • Reported resolution (Å): 3.2
  • Reference model (PDB entry): 3j79 / 3j7a (large / small subunit)
  • Benchmark storage size: 2000 GB
  • EMPIAR ID(s): EMPIAR-10028
  • Imposed symmetry: None (C1)
  • Sample MW (MDa): 4.2
  • Unique MW (kDa): 4200
  • Microscope: Polara 300
  • Voltage (kV): 300
  • Cs (mm): 2.0
  • Detector: Falcon II
  • Frame sampling (Å/pixel): 1.34
  • Total dose (e-/Å²): 20
  • Dose per frame (e-/Å²): 1
  • Frame rate (frames/s): 16
  • Frame alignment method: Statistical
  • Particle selection method: EMAN2
  • Number of particles: 105247
  • Particle/map sampling (Å/pixel): 1.34
  • Raw data contributors: Xiaochen Bai, Sjors Scheres

Target 6. Brome Mosaic Virus
  • Reference EMDB map entry: EMD-6000
  • Primary citation: Wang et al.
  • Reported resolution (Å): 3.8
  • Reference model (PDB entry): 3j7l
  • Benchmark storage size: 460 GB
  • EMPIAR ID(s): EMPIAR-10010 (see FAQ), EMPIAR-10011
  • Imposed symmetry: Icosahedral (I)
  • Sample MW (MDa): 4.6
  • Unique MW (kDa): 80
  • Microscope: JEOL3200FSC
  • Voltage (kV): 300
  • Cs (mm): 4.1
  • Detector: DE12
  • Frame sampling (Å/pixel): 0.99
  • Total dose (e-/Å²): 52
  • Dose per frame (e-/Å²): 1.4
  • Frame rate (frames/s): 25
  • Frame alignment method: DE script
  • Particle selection method: EMAN2
  • Number of particles: 30000
  • Particle/map sampling (Å/pixel): 0.99
  • Raw data contributors: Zhao Wang, Wah Chiu

Target 7. β-Galactosidase
  • Reference EMDB map entry: EMD-5995
  • Primary citation: Bartesaghi et al.
  • Reported resolution (Å): 3.2
  • Reference model (PDB entry): 5a1a
  • Benchmark storage size: 550 GB
  • EMPIAR ID(s): EMPIAR-10013, EMPIAR-10012
  • Imposed symmetry: Dihedral (D2)
  • Sample MW (MDa): 0.47
  • Unique MW (kDa): 120
  • Microscope: Titan Krios
  • Voltage (kV): 300
  • Cs (mm): 2.7
  • Detector: K2
  • Frame sampling (Å/pixel): 0.64
  • Total dose (e-/Å²): 45
  • Dose per frame (e-/Å²): 1.2
  • Frame rate (frames/s): 2.5
  • Frame alignment method: cross-correlation script
  • Particle selection method: Gaussian correlation
  • Number of particles: 11726
  • Particle/map sampling (Å/pixel): 0.64
  • Raw data contributors: Alberto Bartesaghi, Sriram Subramaniam

Notes
  • Reference models are available from RCSB PDB, PDBe, and PDBj under the PDB entries listed above.
  • Availability of raw frames, aligned frames, summed micrographs, summed particle stacks, and initial/final particle coordinates varies by target; see the corresponding EMPIAR entries. Initial particle coordinates are provided in SPIDER (see FAQ) or EMAN box formats where available; final particle coordinates are provided as RELION star files for some targets.
  • Particle coordinates in EMX format, with the python scripts used for conversion, were contributed by Roberto Marabini and Jose Maria Carazo: 10025.emx / 10025.py, 10026.emx / 10026.py, 10005.emx / 10005.py, 10028.emx / 10028.py, 10011.emx / 10011.py, 10013.emx / 10013.py (a minimal reading sketch follows these notes).
  • Thank you to all of the raw data contributors!
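
The EMX files listed in the notes above are XML-based particle-coordinate records. As a rough sketch using only the Python standard library, one could inspect such a file as shown below; since the exact EMX element names are not reproduced here, the code only searches for elements whose tag contains "particle" and prints whatever attributes and child values it finds. It is an illustration, not the contributed conversion scripts.

```python
# Minimal sketch: inspect an EMX particle-coordinate file (EMX is XML-based).
# No specific EMX element names are assumed; particle-like elements are found
# by tag name, and their attributes and text-bearing children are printed.
import xml.etree.ElementTree as ET

tree = ET.parse("10025.emx")          # example file name from the list above
root = tree.getroot()

for elem in root.iter():
    if "particle" in elem.tag.lower():
        children = {child.tag: child.text.strip()
                    for child in elem.iter()
                    if child is not elem and child.text and child.text.strip()}
        print(elem.attrib, children)
```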

EMDataResource Validation Challenges are supported by NIH National Institute of General Medical Sciences

Please send your challenge questions, comments and feedback to challenges@emdataresource.org
