The ASReview project makes use of standardized terminology for all communication regarding ASReview and its underlying technology. An overview of terms and usage can be found in the table below.




ASReview

Means “Active learning for Systematic Reviews” or “AI-assisted Systematic Reviews”, depending on the context. Avoid this explanation in running text; use it only as a tagline.

ASReview project

Use ASReview project as an encompassing term for all work that is done by the ASReview team members and ASReview contributors.

ASReview LAB

Use to indicate the user-friendly interface that has been developed for researchers.

ASReview CLI

Use to indicate the command line interface that has been developed for advanced options or for running simulation studies.

team members

UU employees and students who have permission to devote hours to the ASReview project.


contributors

Everyone contributing to the ASReview project (through GitHub).


extensions

Use to indicate add-ons to the ASReview software, such as the ASReview visualisation extension or the ASReview CORD-19 extension.


Elas

Our Electronic Learning ASsistant and the name of our mascot. Use for storytelling and to increase explainability.



The human annotator who labels records.

Replacement term when context is PRISMA-based reviewing.


record(s)

The data points that need to be labeled. Records can contain both information that is used for training the active learning model and information that is not used for this purpose.

In the case of systematic reviewing, a record is the metadata for a scientific publication. Here, the information used for training purposes is the text of the title and abstract of the publication. The information not used for training typically consists of other metadata, for example the authors, journal, or DOI of the publication.
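For illustration, a single record might be represented as follows. This is a hypothetical sketch; the field names are illustrative and not prescribed by ASReview.

```python
# Hypothetical representation of a single record (field names are
# illustrative, not prescribed by ASReview).
record = {
    # Information used for training the active learning model:
    "title": "Active learning for systematic review screening",
    "abstract": "We evaluate active learning models for title and abstract screening.",
    # Information not used for training (other metadata):
    "authors": ["A. Author", "B. Author"],
    "journal": "Journal of Examples",
    "doi": "10.0000/example",
}

# Only the title and abstract feed into the model's feature extraction.
training_text = record["title"] + " " + record["abstract"]
```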





All of these terms can be used to indicate the decision-making process on the relevance of records (“relevant” or “irrelevant”).

Active learning model

Use to indicate how the next record to be screened by the user is selected.

The model consists of several elements: a query strategy, a feature extraction technique, a classifier, and a balance strategy.
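As a rough illustration of how these four elements fit together, here is a minimal, hypothetical sketch in Python. None of these function names or strategies come from ASReview; each element stands in for a far more sophisticated counterpart.

```python
# Hypothetical sketch of the four elements of an active learning model.
# Not ASReview's actual implementation; all names are illustrative.

def extract_features(record):
    # Feature extraction: a bag of words over title + abstract.
    return set((record["title"] + " " + record["abstract"]).lower().split())

def balance(labeled):
    # Balance strategy: oversample the minority class so the
    # classifier is not swamped by the majority label.
    relevant = [x for x in labeled if x[1] == "relevant"]
    irrelevant = [x for x in labeled if x[1] == "irrelevant"]
    minority, majority = sorted([relevant, irrelevant], key=len)
    if minority:
        minority = minority * (len(majority) // len(minority))
    return minority + majority

def train_classifier(labeled):
    # Classifier: score a record by its word overlap with
    # records already labeled as relevant.
    relevant_words = set()
    for features, label in labeled:
        if label == "relevant":
            relevant_words |= features
    return lambda features: len(features & relevant_words)

def query(classifier, unlabeled):
    # Query strategy (certainty-based): present the unlabeled record
    # the classifier scores as most likely relevant.
    return max(unlabeled, key=lambda r: classifier(extract_features(r)))
```

The sketch only shows how the elements interact: labeled records are rebalanced, a classifier is trained, and the query strategy selects the next record for the user to screen.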


Whenever the user decides that the reviewing process is complete, or when all records have been labeled.


Whenever the data and ASReview project file are openly published on, for example, the OSF.