15th Community Wide Experiment on the
Critical Assessment of Techniques for Protein Structure Prediction
Group performance based on MODEL 1 analysis
Views available: Model 1 | Best model; all groups on 'all groups' targets | server groups on 'all groups' + 'server only' targets.
Target categories: TBM easy (max GDT_TS >= 50), TBM hard (max GDT_TS < 50), TBM/FM, FM, OTHER.
#   GR#   GR name   Domains Count   SUM Zscore (GDT_TS)   AVG Zscore (GDT_TS)   AVG GDT_TS   No. models ranked 1   No. models in Top3   No. models in Top10   No. models GDT_TS>50   No. models GDT_TS>80
(An 's' between the group number and the group name marks a server group. Rows are sorted by SUM Zscore (GDT_TS); AVG Zscore (GDT_TS) equals SUM Zscore divided by Domains Count. A minimal computation sketch follows the table.)
1. 229 s Yang-Server 108 82.184 0.761 83.945 4 15 38 105 73
2. 162 s UM-TBM 109 80.776 0.741 83.720 8 20 40 105 77
3. 439 Yang 108 74.323 0.688 83.103 4 13 36 103 74
4. 278 PEZYFoldings 107 74.266 0.694 82.537 8 24 45 101 69
5. 074 DFolding 109 62.575 0.574 80.788 7 18 37 100 75
6. 180 McGuffin 109 61.198 0.561 79.843 4 12 37 100 68
7. 367 MULTICOM 109 55.940 0.513 80.531 2 7 23 102 72
8. 119 Kiharalab 109 55.444 0.509 79.613 1 4 28 102 68
9. 003 MULTICOM_human 109 55.111 0.506 80.329 2 6 26 102 71
10. 035 s Manifold-E 109 54.675 0.502 78.959 4 12 26 100 66
11. 248 Manifold 109 54.521 0.500 78.726 3 10 24 100 66
12. 185 BAKER 109 53.736 0.493 79.913 2 4 23 101 70
13. 475 s MULTICOM_refine 109 50.621 0.464 79.640 2 5 20 97 71
14. 008 bench 108 48.153 0.446 79.298 2 5 15 98 64
15. 120 s MULTICOM_egnn 109 47.673 0.437 79.380 2 4 21 98 70
16. 086 s MULTICOM_qa 109 46.537 0.427 79.209 0 2 19 98 71
17. 158 s MULTICOM_deep 109 46.481 0.426 79.256 1 4 20 98 70
18. 288 s DFolding-server 109 46.008 0.422 78.130 2 14 19 95 70
19. 037 Wallner 109 44.165 0.405 77.510 5 9 19 94 70
20. 166 s RaptorX 109 44.056 0.404 78.342 2 8 15 96 71
21. 225 ShanghaiTech 109 42.833 0.393 76.261 5 9 24 91 65
22. 462 s MultiFOLD 109 42.521 0.390 76.122 1 6 22 92 66
23. 320 Elofsson 92 42.348 0.460 79.409 4 10 22 82 60
24. 360 MUFold_H 109 41.852 0.384 77.692 2 5 11 95 67
25. 461 colabfold_human 109 41.735 0.383 76.001 4 6 22 91 67
26. 494 Venclovas 74 41.231 0.557 82.679 6 10 20 71 50
27. 446 s ColabFold 109 40.895 0.375 75.942 4 7 22 91 66
28. 092 Agemo_mix 109 39.304 0.361 76.439 2 8 20 92 66
29. 204 Asclepius 108 38.646 0.358 76.783 4 9 17 91 67
30. 125 s UltraFold_Server 109 37.306 0.342 76.464 2 5 15 93 68
31. 054 UltraFold 107 37.050 0.346 76.548 2 5 15 92 68
32. 208 B11L 102 36.962 0.362 76.554 3 3 14 89 56
33. 298 s MUFold 109 36.884 0.338 76.740 3 4 10 94 66
34. 383 s server_124 109 36.509 0.335 76.627 2 3 10 94 66
35. 188 s GuijunLab-DeepDA 109 36.374 0.334 76.727 1 3 9 94 66
36. 403 s server_126 109 35.930 0.330 76.651 1 3 10 93 65
37. 466 s Shennong 105 35.614 0.339 76.870 2 4 12 92 62
38. 399 BeijingAIProtein 103 35.376 0.343 77.072 2 5 15 90 65
39. 131 s Kiharalab_Server 109 34.670 0.318 74.326 1 2 14 89 65
40. 353 s hFold 106 34.564 0.326 77.338 0 3 6 91 64
41. 098 s GuijunLab-Assembly 109 34.559 0.317 76.213 1 2 9 93 63
42. 169 GuijunLab-Human 107 34.547 0.323 76.261 1 2 10 91 64
43. 477 DMP 96 33.979 0.354 79.105 4 9 17 86 62
44. 342 hFold_human 109 33.374 0.306 76.517 0 3 7 94 64
45. 270 s NBIS-AF2-standard 109 33.007 0.303 76.042 2 6 9 93 62
46. 398 ChaePred 107 32.690 0.306 75.566 2 3 9 91 60
47. 245 s FoldEver 109 32.618 0.299 75.114 1 3 8 91 62
48. 282 s GuijunLab-Threader 109 32.450 0.298 75.953 0 2 7 92 66
49. 151 s IntFOLD7 109 32.211 0.296 74.359 4 6 17 91 60
50. 018 s server_123 109 31.775 0.292 75.015 0 4 12 91 64
51. 441 OpenFold 108 31.741 0.294 74.126 2 3 13 89 64
52. 433 OpenFold-SingleSeq 108 31.738 0.294 74.119 2 3 13 89 64
53. 261 s server_122 109 31.629 0.290 75.321 0 2 9 91 63
54. 423 trComplex 108 31.457 0.291 73.986 2 4 11 88 61
55. 481 s GuijunLab-Meta 107 31.453 0.294 75.774 0 3 9 91 60
56. 187 TRFold 108 30.770 0.285 74.050 2 3 9 87 63
57. 264 s server_125 109 30.670 0.281 75.265 0 2 9 91 63
58. 089 s GuijunLab-RocketX 108 30.154 0.279 75.132 0 3 9 92 60
59. 385 FoldEver-Hybrid 101 29.585 0.293 76.748 1 3 7 87 61
60. 165 FTBiot0119 109 29.265 0.268 72.871 1 3 12 89 53
61. 434 Coqualia 104 28.988 0.279 74.728 0 4 11 86 58
62. 117 QUIC 103 28.212 0.274 74.559 2 2 5 87 58
63. 073 s DFolding-refine 106 26.983 0.255 74.394 1 6 9 94 58
64. 227 GinobiFold 106 26.952 0.254 73.936 0 2 8 88 55
65. 011 s GinobiFold-SER 105 26.860 0.256 74.197 1 4 7 88 58
66. 133 s ShanghaiTech-TS-SER 105 26.739 0.255 73.847 0 2 10 87 57
67. 455 Seder2022easy 106 26.396 0.249 72.979 2 5 12 88 59
68. 215 s XRC_VU 80 26.133 0.327 76.632 1 1 7 68 48
69. 269 AP_1 109 25.979 0.238 73.102 0 2 3 91 58
70. 450 s ManiFold-serv 109 25.818 0.237 74.260 0 0 5 93 59
71. 354 hks1988 109 25.380 0.233 74.060 0 2 4 92 61
72. 097 Graphen_Medical 73 23.924 0.328 75.046 2 6 14 60 43
73. 443 s BAKER-SERVER 109 23.513 0.216 71.731 0 1 2 87 53
74. 374 Zheng 45 23.255 0.517 87.596 2 6 15 45 36
75. 239 s Yang-Multimer 45 23.107 0.513 88.398 1 3 14 45 38
76. 275 Bhattacharya 109 22.591 0.207 73.928 1 1 3 92 58
77. 216 Seder2022hard 94 21.628 0.230 74.308 1 4 7 77 54
78. 071 s RaptorX-Multimer 45 20.184 0.449 86.893 2 5 12 45 38
79. 390 s NBIS-AF2-multimer 50 19.770 0.395 85.205 3 6 10 49 36
80. 147 SHT 96 19.019 0.198 73.347 0 1 4 83 49
81. 348 Takeda-Shitaka_Lab 45 18.203 0.405 86.179 2 3 8 45 35
82. 234 Panlab 109 16.643 0.153 69.858 0 0 2 90 45
83. 067 ESM-single-sequence 93 16.581 0.178 68.674 2 4 8 71 42
84. 091 UNRES 103 15.879 0.154 69.472 0 1 1 77 53
85. 150 Grudinin 44 15.773 0.358 84.085 2 4 7 42 32
86. 276 PICNIC 102 15.267 0.150 70.293 2 2 4 83 48
87. 257 WL_team 93 14.906 0.160 71.877 0 1 3 82 39
88. 350 ClusPro 42 14.613 0.348 84.612 1 2 6 40 31
89. 478 Agemo 109 13.970 0.128 57.634 1 2 4 70 20
90. 444 CoDock 47 13.876 0.295 84.102 0 1 6 46 36
91. 314 Pierce 27 12.376 0.458 84.256 1 2 3 26 18
92. 291 Kozakov-Vajda 28 9.890 0.353 81.846 1 1 4 26 19
93. 205 Zou 41 9.552 0.233 81.671 1 2 3 39 27
94. 447 DELCLAB 97 9.069 0.093 55.647 0 0 1 60 33
95. 362 MESHI 70 9.045 0.129 76.292 0 1 1 63 41
96. 427 s MESHI_server 76 9.032 0.119 69.244 1 1 3 61 30
97. 493 Shen-CAPRI 34 8.892 0.262 82.071 0 0 5 32 22
98. 123 RostlabUeFOFold 82 8.774 0.107 55.159 1 2 2 52 13
99. 064 SHORTLE 51 8.644 0.169 77.446 0 0 1 44 30
100. 312 Fernandez-Recio 34 7.606 0.224 82.497 1 2 2 34 23
101. 219 s Pan_Server 104 5.878 0.057 51.859 0 0 1 57 20
102. 199 TB_model_prediction 13 4.747 0.365 90.924 1 1 2 13 11
103. 304 Manifold-X 18 4.463 0.248 86.787 0 0 0 18 13
104. 325 AIchemy_LIG 13 4.284 0.330 89.713 1 1 2 13 10
105. 347 AIchemy_LIG3 13 4.284 0.330 89.713 1 1 2 13 10
106. 456 AIchemy_LIG2 13 4.284 0.330 89.713 1 1 2 13 10
107. 132 TensorLab 11 3.907 0.355 90.010 0 0 3 11 10
108. 140 EMBER3D 92 3.805 0.041 48.781 0 1 1 49 4
109. 397 bio3d 2 3.349 1.674 84.888 1 1 1 2 2
110. 122 zax 8 3.129 0.391 91.196 1 1 3 8 8
111. 052 Gonglab-THU 109 3.116 0.029 42.468 0 0 1 49 2
112. 315 s Cerebra 109 3.015 0.028 42.259 0 0 1 49 1
113. 046 s Manifold-LC-E 15 2.986 0.199 85.266 0 0 0 15 10
114. 370 s wuqi 87 2.889 0.033 61.500 0 0 1 65 18
115. 498 Spider 33 2.837 0.086 47.658 1 1 2 16 7
116. 368 s FALCON2 107 2.721 0.025 45.647 0 0 1 50 7
117. 333 s FALCON0 107 2.721 0.025 45.202 0 0 1 49 7
118. 352 KORP-PL 8 2.425 0.303 81.798 0 0 3 7 6
119. 201 UTMB 6 2.395 0.399 88.848 0 0 1 6 5
120. 338 Convex-PL 6 2.253 0.376 86.923 0 0 2 5 5
121. 212 s BhageerathH-Pro 103 2.027 0.020 39.022 0 0 0 39 13
122. 236 noxelis 5 1.925 0.385 94.119 0 0 0 5 5
123. 472 ddquest 5 1.924 0.385 93.977 0 0 1 5 5
124. 460 Convex-PL-R 6 1.884 0.314 86.356 0 0 1 5 5
125. 412 MeilerLab 3 0.725 0.242 73.127 0 0 0 3 0
126. 014 FEIGLAB 3 0.467 0.156 73.270 0 0 1 3 0
127. 088 coco 2 0.345 0.172 93.871 0 0 0 2 2
128. 366 GatorsML 3 0.332 0.111 62.608 0 0 0 3 0
129. 280 s ACOMPMOD 78 0.220 0.003 28.871 0 0 0 21 2
130. 285 PerezLab_Gators 3 0.144 0.048 71.930 0 0 0 3 0
131. 488 CSRC_ICM 1 0.000 0.000 49.786 0 0 0 0 0
132. 006 Sun_Tsinghua 22 0.000 0.000 18.072 0 0 0 0 0
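The sketch below is a minimal, illustrative reconstruction of how the main columns of the table relate to per-domain GDT_TS scores; it is not the official CASP15 assessment code. The function name rank_groups and the input layout are mine, and two details are assumptions: Z-scores are computed from a single pass over all model-1 scores for each domain (the official assessment may exclude outliers before recomputing the mean), and negative Z-scores are floored at 0 before summing. What is directly verifiable from the table is that AVG Zscore (GDT_TS) = SUM Zscore / Domains Count (e.g., Yang-Server: 82.184 / 108 ≈ 0.761) and that AVG GDT_TS is the mean over the domains the group predicted.

```python
import numpy as np

def rank_groups(gdt_ts):
    """gdt_ts[group][domain] = GDT_TS of the group's model 1 for that domain.

    Returns one summary row per group, sorted by SUM Zscore (GDT_TS),
    mirroring the main columns of the table above.
    """
    rows = []
    for group, scores in gdt_ts.items():
        zs = []
        for domain, gdt in scores.items():
            # Pool of all model-1 GDT_TS values submitted for this domain.
            pool = [s[domain] for s in gdt_ts.values() if domain in s]
            mu, sigma = float(np.mean(pool)), float(np.std(pool))
            z = (gdt - mu) / sigma if sigma > 0 else 0.0
            # Assumption: negative Z-scores are floored at 0 before summing.
            zs.append(max(z, 0.0))
        rows.append({
            "group": group,
            "domains_count": len(scores),
            "sum_zscore": sum(zs),
            "avg_zscore": sum(zs) / len(scores),
            "avg_gdt_ts": float(np.mean(list(scores.values()))),
            "models_gdt_gt_50": sum(v > 50 for v in scores.values()),
            "models_gdt_gt_80": sum(v > 80 for v in scores.values()),
        })
    return sorted(rows, key=lambda r: r["sum_zscore"], reverse=True)
```

The "No. models ranked 1 / in Top3 / in Top10" columns would additionally require ordering all groups by GDT_TS on each domain and counting where each group's model falls; that step is omitted here for brevity.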