A collaborative platform for getting more out of AI research

A flood of AI research preprints lands on arXiv every week, yet around half of them come without code to facilitate reproducibility of their results. How can we efficiently keep up with the fast pace of AI research, and how do we know whether a preprint is trustworthy enough to be worth investing our time in?

More people are taking AI MOOCs now than ever before, but what comes next after someone finishes one? Andrew Ng recommends that people who complete his MOOCs start reading and implementing papers [1]. Researchers, in turn, can benefit from the wider community’s help in creating open-source code implementations of research preprints, while guiding these advanced learners through the process of understanding and implementation.

Our Solution

We started the free Nurture.AI research platform so that AI researchers, practitioners and advanced learners can get more out of AI research: it makes high-quality AI-related preprints easier to discover and crowdsources open-source code implementations of them.

On the Nurture.AI research platform, you get to:

  • Create a personalised newsfeed of Tweets about AI-related arXiv preprints from people you follow, without the noise
  • Use community reviews to help you decide whether to invest your time in a preprint
  • Find and discuss community-contributed code implementations of AI-related arXiv preprints
  • Highlight any text in a preprint to annotate and start a discussion thread

Discussions we hope to encourage on the platform through the annotations feature include:

  1. Questions and elaborations about terms and methods used in a research preprint, why a particular method was used, and whether the results could be improved by using another method
  2. Ideas for extending the research further
  3. Flagging concerns about the reported results and the choice of datasets
  4. Contrasting the methods and approaches with those used in other papers

Some further features we are considering adding to the platform in the near future:

  • A way for users to request open-source code implementations of any particular preprint
  • NLP-based and collaborative filtering recommendation engines (a minimal sketch of the collaborative filtering idea follows this list)
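To make the collaborative filtering idea concrete, here is a minimal, illustrative sketch of one common approach: item-based filtering over a user–paper interaction matrix. The interaction matrix, paper indices and scoring below are hypothetical placeholders for illustration, not a description of the engine we will actually ship.

```python
# Illustrative item-based collaborative filtering for recommending preprints.
# The data and scoring here are hypothetical placeholders, not Nurture.AI's engine.
import numpy as np

# Rows = users, columns = papers; 1 means the user saved or annotated the paper.
interactions = np.array([
    [1, 1, 0, 0, 1],
    [0, 1, 1, 0, 0],
    [1, 0, 1, 1, 0],
    [0, 0, 1, 1, 1],
], dtype=float)

def paper_similarity(matrix):
    """Pairwise cosine similarity between columns (papers)."""
    norms = np.linalg.norm(matrix, axis=0, keepdims=True)
    norms[norms == 0] = 1.0              # avoid division by zero for unseen papers
    normalized = matrix / norms
    return normalized.T @ normalized     # paper-by-paper similarity matrix

def recommend(user_index, interactions, top_k=2):
    """Score unseen papers by their similarity to papers the user already engaged with."""
    sim = paper_similarity(interactions)
    user_row = interactions[user_index]
    scores = sim @ user_row              # aggregate similarity to the user's papers
    scores[user_row > 0] = -np.inf       # never re-recommend papers already seen
    return np.argsort(scores)[::-1][:top_k]

print(recommend(user_index=0, interactions=interactions))  # indices of suggested papers
```

In practice, a production engine would combine signals like this with NLP-based features of the preprint text itself, but the core idea of scoring unseen papers against a user's past engagement stays the same.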

We would love to hear your thoughts on the current platform and what we are building, as well as any problems you face in learning AI or conducting AI research. Our team is committed to making this the best possible AI research platform for you. Comment below, click the “Feedback for Nurture.AI” icon in the bottom-right corner of the platform, or email me at jiaqing@nurture.ai. We look forward to hearing from you, and don’t forget to clap if you like this!

Jia Qing is the Founder & CEO of Nurture.AI, a startup dedicated to bringing rigour and accessibility to AI research and education. Nurture.AI organised the Global NIPS Paper Implementation Challenge 2017, which saw more than 1000 participants from 40+ countries and 46 successful code implementations of NIPS 2017 papers. We also catalysed free-to-attend weekly AI Saturdays benefiting more than 5000 people in cities around the world, including Seattle, Santiago, Kathmandu, Singapore and Bangalore, where participants go through AI lecture material used at universities such as Stanford, UC Berkeley and UCL.

[1] Quora Question on “How can beginners in machine learning who have finished their MOOCs in machine learning and deep learning, take it to the next level and get to the point of being able to read research papers & productively contribute in an industry?”

