MSR Image Recognition Challenge (IRC)

Microsoft Research is happy to continue hosting this series of Image Recognition (Retrieval) Grand Challenges. Do you have what it takes to build the best image recognition system? Enter the MSR Image Recognition Challenges at ACM Multimedia and/or IEEE ICME to develop your image recognition system on real-world, large-scale data.

Current Challenge: MS-Celeb-1M: Recognizing One Million Celebrities in the Real World

Details: MSR Image Recognition Challenge @ ACM MM 2016       

Last Challenge: MSR IRC @ IEEE ICME 2016

We just finished the evaluation! More details here.

Rank  TeamID  Team Name      Precision@5  Used External Data
1     30      NLPR_CASIA     89.65%       Yes
2     16      ybt_bj         86.90%       No
3     5       NFS2016        85.00%       Yes
4     20      WestMountain   84.75%       Yes
5     3       rucmm          84.55%       Yes
6     17      CASIIE-Asgard  83.40%       Yes
7     31      GoRocketsGo    81.85%       Yes
8     2       CDL-USTC       73.10%       Yes
9     4       lyg            71.35%       No
10    10      FrenchBulldog  71.25%       No
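
The leaderboard above is ranked by Precision@5. The organizers' exact relevance judgments and evaluation script are not reproduced on this page, so the snippet below is only a minimal sketch of a Precision@5-style computation, assuming each test image has a set of ground-truth labels and each system submits a ranked list of predicted labels (the function names and data layout are illustrative assumptions, not the official tool):

    # Illustrative Precision@5 sketch (not the official evaluation code).
    # predictions:  image_id -> ranked list of predicted labels
    # ground_truth: image_id -> set of labels judged correct for that image

    def precision_at_k(ranked_labels, relevant_labels, k=5):
        """Fraction of the top-k predicted labels that are in the ground-truth set."""
        top_k = ranked_labels[:k]
        return sum(1 for label in top_k if label in relevant_labels) / k

    def mean_precision_at_k(predictions, ground_truth, k=5):
        """Average Precision@k over all evaluated images."""
        scores = [precision_at_k(predictions[image_id], labels, k)
                  for image_id, labels in ground_truth.items()]
        return sum(scores) / len(scores)

    if __name__ == "__main__":
        preds = {"img1": ["cat", "dog", "car", "tree", "boat"]}
        truth = {"img1": {"cat", "tree"}}
        print(mean_precision_at_k(preds, truth, k=5))  # 0.4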

 

Past Challenge: MSR-Bing IRC @ ACM MM 2015

We have finished the challenge in ACM MM 2015. More details here.

  • Important Dates:
    • Dataset available for download (Clickture-Lite) and hard-disk delivery (Clickture-Full).
    • June 18, 2015: Trial set available for download and test.
    • June 24, 2015: Final evaluation set for Task#1 available for download (encrypted)
    • June 26, 2015: Evaluation starts (0:00am PDT)
    • June 27, 2015: Evaluation ends (0:00am PDT)
    • June 28, 2015: Evaluation results announced
    • July 7, 2015: Paper submission deadline
    • July 27, 2015: Notification of acceptance
    • August 15, 2015: Camera-ready submission deadline
    • October 28, 2015: Grand Challenge Workshop
  • Latest updates: 

    • May 22, 2015: Pre-registration form available at http://1drv.ms/1K9aAxo.
    • June 11, 2015: Training data set ready for downloading: details
    • June 18, 2015: Trial set for Task#1 is available for download (the same as ACM MM 2014): http://1drv.ms/1pq08Wq
    • June 18, 2015: Trial code samples for Task#2 were delivered by email. Contact us if you haven't received them.
    • June 19, 2015: Test tool for Task#2 was delivered by email. Contact us if you haven't received it.
    • June 24, 2015: Evaluation set for Task#1 is available here (encrypted); please download and unzip it.
    • June 24~25, 2015: For Task#2, dry-run traffic will be sent to your recognition service; please keep your recognition service running!
    • June 26, 2015: Password to decrypt the Task#1 evaluation data was delivered to all participants by email at 0:00am PST; please let us know if you haven't received it.
    • June 28, 2015: Evaluation results were sent back to teams.
    • July 1, 2015: Evaluation result summary:

 

                        Task1: Image Retrieval                       Task2: Image Recognition
TeamID  TeamName        Run1-Master  Run2      Run3      Rank-Task1  Accuracy@1  Accuracy@5  Rank-Task2
1       TINA            -            -         -         -           -           -           -
2       rucmm           0.52006239   0.489675  0.492945  1           42%         71%         2
3       SSDUT           -            -         -         -           -           -           -
4       AmritaLearning  -            -         -         -           -           -           -
5       HIK             -            -         -         -           -           -           -
6       DeepIR          -            -         -         -           -           -           -
7       IVA             0.471570894  0.462925  0.463261  3           57%         85%         1
8       VMA             -            -         -         -           -           -           -
9       WJ-QCZ          0.486851763  -         -         2           -           -           -
        Random          0.425987601  -         -         -           -           -           -
        Groundtruth     0.692381702  -         -         -           -           -           -
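
For Task 2 the table reports Accuracy@1 and Accuracy@5. The official Task#2 evaluation tool was distributed to participants by email and is not reproduced here; as an illustration only, the following is a minimal top-k accuracy sketch, assuming one ground-truth label per image and a ranked list of predicted labels per image (all names below are illustrative assumptions):

    # Illustrative top-k accuracy sketch (not the official Task#2 tool).
    # predictions:  image_id -> ranked list of predicted labels
    # ground_truth: image_id -> the single correct label

    def top_k_accuracy(predictions, ground_truth, k):
        """Fraction of images whose correct label appears among the top-k predictions."""
        hits = sum(1 for image_id, label in ground_truth.items()
                   if label in predictions.get(image_id, [])[:k])
        return hits / len(ground_truth)

    if __name__ == "__main__":
        preds = {"a": ["dog", "cat", "fox", "cow", "owl"],
                 "b": ["car", "bus", "van", "cab", "jet"]}
        truth = {"a": "cat", "b": "bike"}
        print(top_k_accuracy(preds, truth, k=1))  # 0.0
        print(top_k_accuracy(preds, truth, k=5))  # 0.5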

Past Challenge: MSR-Bing IRC @ ICME 2015 

  • Important dates:

    • April 21: Final evaluation set available for download here (encrypted)
    • April 24: Evaluation starts (password to decrypt the evaluation set delivered at 2:30am on April 24, PDT)
    • April 25: Evaluation ends at 3:00am PDT (very beginning of April 25); result submission due.
    • April 28: Evaluation results have been sent to the corresponding teams.
    • May 1, 2015: Paper submission (please follow the guideline of the main conference)
    • May 10, 2015: Notification
    • May 15, 2015: Paper camera ready due
  • Updates:

Past Challenge: MSR-Bing IRC @ ACM MM 2014

For more details about the challenge, please visit:

1. The grand challenge page at ACM Multimedia 2014
2. IRC @ MM 14 at this site

The latest announcements will be posted here.

Updates:

  • July 5: Evaluation results:

 

  • June 26: Due to many requests, the MM14 grand challenge submission deadline was extended by one week, so we are also extending the MSR-Bing challenge result submission deadline by one week. Please check the updated dates below.
  • June 25: The encrypted evaluation dataset is available for download now: http://1drv.ms/1lfawui. Please follow the steps below to submit your prediction results:
    1. Register a "paper" entry at https://cmt.research.microsoft.com/IRC2014. Make sure to finish this step ASAP (at the latest 30 minutes before the challenge starts). The password to decrypt the evaluation set will be sent through CMT.
    2. Download the encrypted evaluation dataset. Please note that the downloaded file was zipped twice (once with a password and once without).
    3. Unzip the downloaded file (without password) to make sure the file is not corrupted.
    4. Unzip the file you get from Step 3 with the password that will be sent to you through CMT. You will then get two files: one is a (key, image thumbnail) table, and the other is a (key, label) table. Please refer to this page for details on how to generate prediction results; a rough loading sketch is shown after this list.
    5. Before the end of the challenge, submit your prediction results (up to 6 zipped files - see instructions below).
    6. Submit your grand challenge paper according to the guidelines on the ACM Multimedia 2014 website. Please note that the CMT site is only for prediction result submission; your paper should be submitted to the EasyChair paper system. Make sure that you include your evaluation results in the paper (they will be sent to you before the paper submission deadline).
  • June 25: The evaluation set will be available by EOD today, and CMT will be online at the same time. Instructions: you are requested to register an entry at the CMT site to receive the password to decrypt the evaluation set as well as to submit your prediction results. Please note that prediction results based on Clickture-Lite (1M images) are mandatory, while results on Clickture-Full (40M images) are optional. When submitting prediction results, please name the files appropriately so we know which are based on the 1M dataset (include "1M" in the file name), which are based on the 40M dataset (include "40M" in the file name), and which are master runs (include "master" in the file name). If you submit results based on both datasets, you are allowed to submit three runs for each dataset (including one master run for each dataset). Please note that the final evaluation will be based on the master runs, though we will also return the scores for the other runs. (New!)
  • June 25: Evaluation starts and ends dates changed (1 day delay).
  • June 19: Trial set is available here: http://1drv.ms/1pq08Wq  (New!)
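
Purely as a rough sketch of steps 2-4 above (the double unzip and loading the two tables): the snippet below assumes the two files are tab-separated (key, value) tables with base64-encoded image thumbnails, and all file names and the password placeholder are made-up illustrations; the authoritative format description is on the page linked from step 4. Note that Python's zipfile module only handles legacy ZIP encryption, so an AES-encrypted archive would need an external tool such as 7-Zip instead.

    # Hypothetical loading sketch for the MSR-Bing evaluation set (not official code).
    import base64
    import csv
    import zipfile

    OUTER_ZIP = "evaluation_set.zip"    # assumed file name
    PASSWORD = b"password-from-CMT"     # delivered through CMT, not public

    # Step 3: the outer archive is not password protected.
    with zipfile.ZipFile(OUTER_ZIP) as outer:
        outer.extractall("stage1")

    # Step 4: the inner archive requires the CMT password (legacy ZIP encryption only).
    with zipfile.ZipFile("stage1/evaluation_set_inner.zip") as inner:
        inner.extractall("stage2", pwd=PASSWORD)

    def load_tsv(path):
        """Load a (key, value) table stored as tab-separated lines."""
        with open(path, newline="", encoding="utf-8") as f:
            return {row[0]: row[1] for row in csv.reader(f, delimiter="\t")}

    # One table maps keys to image thumbnails (assumed base64-encoded JPEG bytes),
    # the other maps keys to text labels.
    thumbnails = load_tsv("stage2/key_thumbnail.tsv")
    labels = load_tsv("stage2/key_label.tsv")

    key = next(iter(thumbnails))
    jpeg_bytes = base64.b64decode(thumbnails[key])   # raw image bytes for a decoder
    print(key, len(jpeg_bytes), labels.get(key))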

Schedule (updated on June 26):

  • Feb 15, 2014: Dataset available for download (Clickture-Lite) and hard-disk delivery (Clickture-Full).
  • June 18: Trial set available for download and test.
  • June 25: Final evaluation set available for download (encrypted)
  • July 3 (updated/firm): Evaluation starts (password to decrypt the evaluation set delivered at 11:30pm on July 2, PDT)
  • July 4 (updated/firm): Evaluation ends at 0:00am PDT (very beginning of July 4); result submission due
  • July 5: Evaluation results announced.
  • July 6, 2014: Paper submission (please follow the guideline of the main conference)

Links to the Challenges at Different Conferences:

from: http://research.microsoft.com/en-us/projects/irc/