Oracle Mode

This is a quick tour of using the ASReview LAB software in Oracle Mode, the user-friendly frontend for interactively screening unlabeled data with active learning in systematic reviews. A more elaborate instruction can be found in this blog post on the ASReview website.

This tutorial assumes you have already installed Python and ASReview. If this is not the case, check out the Installation page. Also, you should have created a project.

Select Dataset

Select the dataset you want to use, which should contain at least the titles and/or abstracts of all documents (records) you want to screen.
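As a sketch of what such a dataset might look like, the snippet below writes a minimal two-record CSV with a title and abstract per record. The column names used here are common conventions for illustration only; consult the ASReview documentation for the exact column names it recognizes.

```python
import csv
import io

# Minimal dataset sketch: one row per record, with at least a title
# and/or abstract column. Column names here are illustrative.
records = [
    {"title": "Deep learning for screening", "abstract": "We study automated screening."},
    {"title": "A survey of active learning", "abstract": "This survey reviews query strategies."},
]

buf = io.StringIO()
writer = csv.DictWriter(buf, fieldnames=["title", "abstract"])
writer.writeheader()
writer.writerows(records)

print(buf.getvalue().splitlines()[0])  # the header row
```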

There are four ways to select a dataset:

ASReview dataset selector

After a successful upload of the data, move to the next step.

ASReview dataset uploaded

Select Prior Knowledge

The first iteration of the active learning cycle requires some prior knowledge to work. This knowledge is used to train the first model. In this step you need to provide at least one relevant and one irrelevant document. To facilitate this, you can search for specific records within your dataset (to find prior relevant papers), ask the software to present a couple of random documents (to find prior irrelevant papers), or upload partially labeled data.
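The idea behind this step can be sketched as follows: records you have already labeled form the seed training set, and active learning can only start once both classes are present. The field names and labels below are illustrative, not ASReview's internal format.

```python
# Sketch of partially labeled data as prior knowledge:
#   1 = relevant, 0 = irrelevant, None = still to be screened.
records = [
    {"title": "Statins and cardiovascular risk", "label": 1},     # prior relevant
    {"title": "Bridge maintenance schedules",    "label": 0},     # prior irrelevant
    {"title": "Aspirin in primary prevention",   "label": None},  # unlabeled
]

prior = [r for r in records if r["label"] is not None]
unlabeled = [r for r in records if r["label"] is None]

# The first model can only be trained once both classes are represented.
assert {r["label"] for r in prior} == {0, 1}, \
    "need at least one relevant and one irrelevant record"
print(len(prior), "prior records,", len(unlabeled), "left to screen")
```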

ASReview prior knowledge selector next

Select Active Learning Model

In the next step of the setup, you can select a model. The default setup (Naïve Bayes, tf-idf, Max) offers fast and excellent overall performance, but many more options are available. After choosing your model, click on Finish. You will return to the project page and the model is trained for the first time.
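To illustrate what the default combination does, here is a minimal sketch using scikit-learn: tf-idf features, a Naive Bayes classifier trained on the two prior labels, and a "Max" query strategy that selects the unlabeled record with the highest predicted relevance. This is a toy stand-in, not ASReview's actual implementation, and the corpus is invented for the example.

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.naive_bayes import MultinomialNB

# Toy corpus: the first two records are the labeled priors,
# the last two are unlabeled and waiting to be screened.
texts = [
    "statin therapy reduces cardiovascular events",  # labeled relevant
    "bridge maintenance scheduling optimisation",    # labeled irrelevant
    "aspirin lowers cardiovascular mortality",       # unlabeled
    "railway track inspection intervals",            # unlabeled
]
labels = [1, 0]  # labels for texts[0] and texts[1]

vec = TfidfVectorizer()
X = vec.fit_transform(texts)

model = MultinomialNB().fit(X[:2], labels)

# "Max" query strategy: show the unlabeled record with the highest
# predicted probability of being relevant.
scores = model.predict_proba(X[2:])[:, 1]
best = int(scores.argmax())  # index into the unlabeled slice
print(texts[2 + best])
```

After each new label, retraining on the enlarged training set and re-scoring the remaining records gives the next record to present.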

ASReview model

Start Reviewing

As soon as the model is ready, a Start Review button appears. Click the button to start screening. ASReview LAB presents you with a document to screen and label. If you have selected certainty-based sampling, it will be the document with the highest relevance score.

You are asked to make a decision: relevant or irrelevant?

ASReview Screening

While you review the documents, the software continuously improves its understanding of your decisions, constantly updating the underlying model.
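The loop described above, retrain on every decision, then present the highest-scoring unlabeled record, can be sketched in a few lines. The word-overlap scorer below is a trivial stand-in for the real classifier, and the record texts and simulated answers are invented for the example.

```python
# Toy screening loop: after every decision the "model" (a word-overlap
# scorer here) is rebuilt from all labels so far, and the next record
# shown is the highest-scoring unlabeled one.
def score(text, relevant_texts):
    vocab = set()
    for t in relevant_texts:
        vocab |= set(t.split())
    words = text.split()
    return sum(w in vocab for w in words) / len(words)

records = {
    "r1": "statin therapy cardiovascular outcomes",
    "r2": "aspirin cardiovascular prevention trial",
    "r3": "bridge maintenance inspection report",
}
labels = {"r1": 1}                        # one decision made so far
simulated_answers = {"r2": 1, "r3": 0}    # stands in for the user

while len(labels) < len(records):
    relevant = [records[k] for k, v in labels.items() if v == 1]
    pending = [k for k in records if k not in labels]
    # present the record with the highest relevance score
    nxt = max(pending, key=lambda k: score(records[k], relevant))
    labels[nxt] = simulated_answers[nxt]  # the user's decision

print(labels)
```

Note how "r2" is presented before "r3": it shares a word with the record already labeled relevant, so it gets the higher score.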

As you keep reviewing documents and providing more labels, the number of unlabeled documents left in the dataset will decline. When to stop is left to the user, and we provide some tips in our blog post.

Download Results

During screening, or from the project dashboard, you can download the results with your decisions by clicking the download icon. A dialog will show the download options. Choose from the menu whether you would like to download your results as a CSV or an Excel file and click Download.
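If you export to CSV, the results file can be post-processed with standard tools. The snippet below is a sketch: the column name "included" (1 = relevant, 0 = irrelevant) is an assumption about the export format, so check the header of your own file before relying on it.

```python
import csv
import io

# Simulated export file; in practice you would open the downloaded CSV.
# The "included" column name is an assumed convention, not guaranteed.
export = io.StringIO(
    "title,included\n"
    "Statins and cardiovascular risk,1\n"
    "Bridge maintenance schedules,0\n"
)

rows = list(csv.DictReader(export))
relevant = [r["title"] for r in rows if r["included"] == "1"]
print(relevant)
```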

ASReview project download

Return to Project Dashboard

If you want to return to the project dashboard, click the hamburger menu (top left) and click Project Dashboard.