Conference Agenda


Session Overview
O 1.1: Online Discussion Information Retrieval
Thursday, 09/June/2022:
2:00pm - 2:30pm

Session Chair: Louise Farragher

The abstracts will not be presented live during this session. You are advised to view the recorded presentations before the session. Presenters will briefly introduce themselves, then discuss their research and answer questions.

ID: 156 / O 1.1: 1
Online Oral Presentation
Topics: Information retrieval and evidence syntheses

Exploring the impact of searching on priority screening in broad topic areas

Claire Stansfield, Alison O'Mara-Eves

EPPI-Centre, UCL Social Research Institute, University College London, United Kingdom

Aim: To compare the performance of priority screening in terms of the number of citations needed to screen at different levels of recall of relevant citations.

Method: A series of retrospective analyses were undertaken to compare the performance of priority screening between batches of references that differ by their sensitivity. We examined the effect of screening volume and topic breadth for four review topics where research terminology is diverse (digital interventions for reducing alcohol and drug use, Lyme disease research, online content of eating disorders, and embedded researchers). Simulations of priority screening were undertaken in EPPI-Reviewer and the results analysed in Excel.
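The simulation's key measure (how far down a priority-screened ranking one must screen to reach a given recall of the included studies) can be sketched as follows; the function and the ranking data are illustrative, not EPPI-Reviewer's implementation.

```python
# Sketch: how many references must be screened to reach each recall target,
# given a priority-screening ranking (illustrative data, not EPPI-Reviewer output).

def citations_needed(ranking, n_relevant, targets=(0.90, 0.95, 0.99, 1.00)):
    """For each target recall, return the rank position at which that fraction
    of the relevant (included) references has been found.
    `ranking` is the screening order as booleans: True = relevant reference."""
    results = {}
    found = 0
    for position, is_relevant in enumerate(ranking, start=1):
        if is_relevant:
            found += 1
        for t in targets:
            # record the first position at which recall reaches the target
            if t not in results and found >= t * n_relevant:
                results[t] = position
    return results

# Example: 10 relevant references at ranks 10, 20, ..., 100 of a 100-record list
ranking = [i % 10 == 0 for i in range(1, 101)]
print(citations_needed(ranking, n_relevant=10))
# → {0.9: 90, 0.95: 100, 0.99: 100, 1.0: 100}
```

In this toy example, a lower-precision search (relevant records spread thinly through the ranking) forces nearly the whole list to be screened to reach the highest recall levels, which is the effect the abstract describes.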

Results: The measure is the average number of references that must be screened to reach 90%, 95%, 99%, and 100% recall of the included studies. We will present the results of the analysis, which is ongoing. Early indications from one case study suggest that, to achieve above 95% recall, manual screening is considerably increased for lower-precision searches.
Conclusions: In some cases, running a broader search is more feasible than designing a narrower one, even if the latter could achieve potentially higher recall with fewer records screened, owing to the complexity of designing sensitive narrow searches. This analysis informs understanding of the opportunities and limitations of taking this approach.

Human touch

This research improves understanding of how search strategies and the volume of results affect the use of machine learning technology to reduce human effort in manual screening.

Biography and Bibliography

Claire Stansfield is an information specialist and Senior Research Fellow with research interests in information retrieval methods for systematic research reviews that inform public policy.
Alison O'Mara-Eves is an Associate Professor and specialises in methods for systematic reviews and meta-analysis. Her interests include evaluating the use of emerging text mining techniques for facilitating the production of systematic reviews.

Bibliography from related research:

Stansfield, C, Stokes, G, Thomas, J. Applying machine classifiers to update searches: Analysis from two case studies. Res Syn Meth. 2021; 1- 13. doi:10.1002/jrsm.1537

O'Mara-Eves A, Thomas J, McNaught J, Miwa M, Ananiadou S. Using text mining for study identification in systematic reviews: a systematic review of current approaches. Syst Rev. 2015 Jan 14;4:5. doi: 10.1186/2046-4053-4-5. Review. Erratum in: Syst Rev. 2015;4:59

Stansfield-Exploring the impact of searching on priority screening-156_a.mp4

ID: 1223 / O 1.1: 2
Oral Presentation
Topics: Information retrieval and evidence syntheses

Developing searchRxiv: An international transdisciplinary repository for search strategies

Neal Haddaway2, Melissa L. Rethlefsen1, Cristina Ashby3

1University of New Mexico Health Sciences Library & Informatics Center, United States of America; 2Stockholm Environment Institute; 3CABI

Documenting, saving, and sharing search strategies is an important component of transparent reporting for systematic reviews. It is also helpful for individual practice, enabling valuable search “blocks” on a myriad of topics to be shared, modified, and reused. Though documentation and sharing are critical, these activities have been scattered across dozens of resources, from individual journal supplemental files to, rarely, institutional repositories. Shared search block resources that have been developed previously have never been adopted broadly; successful platforms for documentation and sharing have been local.
We sought to develop a transdisciplinary platform to share search strategies and their documentation.
Method/ Program Description
We collaborated with CABI to develop searchRxiv, a new subject-agnostic platform for documenting and sharing search strategies. Initial conversations led to the development of a proposal for a standardised file type for documenting systematic searches, which outlined the key background issues shaping the platform, including the current state of poor reporting quality of search strategies, the need for librarians to receive credit for their intellectual contributions, and a lack of accessibility to search strategies as a contributor to research waste. We proposed creating a platform which would: improve the accuracy of search strategy documentation, create citable records for each search strategy, and be openly available to improve accessibility. We established an Advisory Group to provide feedback on the proposal, data elements, and standards required for maximum benefit and reproducibility. After review by the Advisory Group, 27 data elements were proposed.
Results/ Evaluation
Using the initial proposal, plus the data elements and standards proposed by the core group and the Advisory Group, CABI developed searchRxiv in mid-2021, built on CABI's existing technology infrastructure. searchRxiv enables individuals to create a DOI-stamped record of a search strategy or a search block. Fields captured include the title, the search strategy, the date of the search, update dates, the review question, a description, keywords, validation information, whether the search was peer reviewed, links to publications, and database details.
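A record with the fields listed above might look like the following sketch; the dictionary structure, key names, and sample values are illustrative assumptions, not searchRxiv's actual data model.

```python
# Sketch of a search-strategy record with the fields the abstract lists.
# Key names and sample values are illustrative assumptions, not
# searchRxiv's actual schema.
search_record = {
    "title": "Brief alcohol interventions - MEDLINE block",  # hypothetical
    "search_strategy": "(alcohol* OR drinking*) AND (brief adj2 intervention*)",
    "search_date": "2021-06-15",
    "update_dates": [],
    "review_question": "Do brief interventions reduce alcohol misuse?",
    "description": "Population/intervention block for MEDLINE (Ovid).",
    "keywords": ["alcohol", "brief intervention"],
    "validation_information": None,
    "peer_reviewed": False,        # whether the search was peer reviewed
    "linked_publications": [],     # e.g. DOIs of reviews using this search
    "database_details": "MEDLINE (Ovid), 1946 to present",
}
```

A deposited record would additionally receive a DOI, making the search citable as described in the abstract.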

searchRxiv remains in active development as feedback from the user community is received. Long-term, the vision for searchRxiv is to connect it to major search platforms to enable automatic uploading to searchRxiv to improve documentation.

Haddaway-Developing searchRxiv-1223_a.pdf

Haddaway-Developing searchRxiv-1223_b.pptx

ID: 1120 / O 1.1: 3
Oral Presentation
Topics: Information retrieval and evidence syntheses

Reducing systematic review burden using Deduklick: a novel, automated, reliable, and explainable deduplication algorithm

Nikolay Borissov1,2, Quentin Haas1,2, Beatrice Minder3, Doris Kopp-Heim3, Marc von Gernler4, Heidrun Janka4, Douglas Teodoro5,6, Poorya Amini1,2

1Risklick AG, Spin-off University of Bern, Bern, Switzerland; 2CTU Bern, University of Bern, Bern, Switzerland; 3Public Health & Primary Care Library, University Library of Bern, University of Bern, Switzerland; 4Medical Library, University Library of Bern, University of Bern, Bern, Switzerland; 5University of Applied Sciences and Arts Western Switzerland, Geneva, Switzerland; 6Department of Radiology and Medical Informatics, University of Geneva, Geneva, Switzerland



Identifying and removing reference duplicates when conducting systematic reviews (SRs) remains a major, time-consuming issue for authors who manually check for duplicates using built-in features in citation managers. To address issues related to manual deduplication, we developed an automated, efficient, and rapid artificial intelligence (AI)-based algorithm named Deduklick. Deduklick combines natural language processing (NLP) algorithms with a set of rules created by expert information specialists.


Deduklick’s deduplication uses a multistep algorithm that normalizes data, calculates a similarity score, and identifies unique and duplicate references based on metadata fields such as title, authors, journal, DOI, year, issue, volume, and pages. We measured and compared Deduklick’s capacity to accurately detect duplicates against the information specialists’ standard manual duplicate-removal process using EndNote on eight heterogeneous datasets. Using a sensitivity analysis, the efficiency and noise of both methods were manually cross-compared.
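The multistep idea described here (normalize metadata, score field-wise similarity, flag pairs above a threshold) can be sketched as follows; the normalization rules, field weights, and threshold are illustrative assumptions, not Deduklick's actual algorithm.

```python
import re
from difflib import SequenceMatcher

# Sketch of metadata-based deduplication: normalize fields, compute a
# weighted similarity score, flag near-identical pairs. Rules, weights,
# and the threshold are illustrative assumptions, not Deduklick's.

def normalize(text):
    """Lowercase, strip punctuation, collapse whitespace."""
    return re.sub(r"\s+", " ", re.sub(r"[^\w\s]", " ", (text or "").lower())).strip()

def similarity(ref_a, ref_b):
    """Weighted average string similarity over fields present in both records."""
    weights = {"title": 0.5, "authors": 0.2, "journal": 0.1, "year": 0.1, "doi": 0.1}
    score = total = 0.0
    for field, w in weights.items():
        a, b = normalize(str(ref_a.get(field, ""))), normalize(str(ref_b.get(field, "")))
        if a and b:
            total += w
            score += w * SequenceMatcher(None, a, b).ratio()
    return score / total if total else 0.0

def deduplicate(refs, threshold=0.85):
    """Keep the first record of each group of near-identical references."""
    unique = []
    for ref in refs:
        if all(similarity(ref, kept) < threshold for kept in unique):
            unique.append(ref)
    return unique

refs = [
    {"title": "Deduplication in reviews", "authors": "Smith J", "year": 2020},
    {"title": "Deduplication in Reviews.", "authors": "Smith, J.", "year": 2020},
    {"title": "Citation tracking methods", "authors": "Lee K", "year": 2019},
]
print(len(deduplicate(refs)))  # → 2 (the two near-identical records collapse)
```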


Following deduplication and comparing performance measurements, Deduklick achieved an average recall of 99·51%, an average precision of 100·00%, and an average F1 score of 99·75%. In contrast, the manual deduplication process achieved an average recall of 88·65%, an average precision of 99·95%, and an average F1 score of 91·98%. Deduklick thus achieved performance equal to or higher than expert level on duplicate removal. It also preserved high metadata quality and drastically reduced the time spent on analysis.
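The F1 score is the harmonic mean of precision and recall, computed per dataset and then averaged; a quick sketch of the relation (note that an average F1 across datasets need not equal the harmonic mean of the averaged precision and recall, which is why only the Deduklick figures line up exactly):

```python
def f1(precision, recall):
    """Harmonic mean of precision and recall (here in percent)."""
    return 2 * precision * recall / (precision + recall)

# Deduklick's averaged precision/recall are consistent with its average F1:
print(round(f1(100.00, 99.51), 2))  # → 99.75
```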


Deduklick represents an efficient, transparent, ergonomic, and time-saving solution for finding and removing duplicates in SRs. Deduklick could therefore simplify SR production and offer important advantages for scientists, including saving time, increasing accuracy, reducing costs, and contributing to quality SRs.

Human Touch (Recommended)

Automated, reliable, and explainable deduplication of trial and publication metadata as part of the systematic review process.

Borissov-Reducing systematic review burden using Deduklick-1120_a.pptx

Borissov-Reducing systematic review burden using Deduklick-1120_b.pdf

Borissov-Reducing systematic review burden using Deduklick-1120_c.mp4

ID: 1169 / O 1.1: 4
Oral Presentation
Topics: Information retrieval and evidence syntheses

Use and benefit of citation tracking techniques in evidence synthesis: a scoping review

Christian Appenzeller-Herzog1, Julian Hirt2,3,4, Thomas Nordhausen3, Hannah Ewald1

1University Medical Library, University of Basel, Basel, Switzerland; 2Department of Clinical Research, University Hospital Basel, University of Basel, Basel, Switzerland; 3International Graduate Academy, Institute for Health and Nursing Science, Medical Faculty, Martin Luther University Halle-Wittenberg, Halle (Saale), Germany; 4Institute of Applied Nursing Science, Department of Health, Eastern Switzerland University of Applied Sciences (formerly FHS St.Gallen), St.Gallen, Switzerland

Citation tracking techniques can be used as supplementary search methods in systematic reviews. They aim at collecting cited and citing references from pertinent references that are already known. Evidence-based recommendations on how and when to optimally use citation tracking are needed to guide systematic review workflows.
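The two directions of citation tracking described here (collecting the references a pertinent paper cites, and the papers that cite it) can be sketched over a toy in-memory citation graph; the paper IDs and links are illustrative.

```python
# Sketch of backward and forward citation tracking over a toy in-memory
# citation graph; the paper IDs and links are illustrative.
cites = {              # paper -> the papers its reference list cites
    "seed":  ["old1", "old2"],
    "newer": ["seed", "old1"],
    "other": ["old2"],
}

def backward(paper):
    """Backward tracking: the references cited by a known pertinent paper."""
    return set(cites.get(paper, []))

def forward(paper):
    """Forward tracking: papers whose reference lists cite the known paper."""
    return {p for p, refs in cites.items() if paper in refs}

print(backward("seed"))  # the seed's own reference list: old1, old2
print(forward("seed"))   # papers citing the seed: newer
```

In practice the forward direction requires a citation index (e.g. Web of Science, Scopus, or Google Scholar, as the abstract notes), since a paper cannot know its future citers.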
This scoping review maps the benefit of citation tracking in systematic literature searching for health-related topics (1).
Method/ Program Description
Methodological studies on evidence retrieval by citation tracking in health-related systematic literature searching with no restrictions on study design, language, and publication date were eligible. We searched MEDLINE, Web of Science Core Collection, CINAHL, LLISFT, and LISTA. Additionally, we performed web searching via Google Scholar as well as backward and forward citation tracking of included studies using Google Scholar, Scopus, and Web of Science. Experts were contacted for additional eligible studies. Two reviewers independently assessed reference eligibility. Data extraction and analysis were performed by one reviewer and double-checked by another.
Results/ Evaluation
We identified 47 eligible studies that were published between 1985 and 2021. Studies came mostly from the UK (n=17, 37%) or the US (n=15, 33%). The aims of the studies were to assess (i) the benefit or effectiveness of citation tracking (e.g., the number and proportion of studies included in a systematic review uniquely via citation tracking), (ii) technical applications of citation tracking (e.g., using Web of Science vs. Google Scholar), (iii) comparisons of different citation tracking methods (e.g., direct vs. indirect citation tracking), or a combination thereof. Full scoping review results will be available in early 2022.

The preliminary results show that almost 50 studies published over the last 36 years assessed citation tracking for health-related evidence retrieval. The full results of this scoping review will inform an international online Delphi study aiming at the development of recommendations for citation tracking in systematic literature searching (1). For example, we hope to derive guidance on which citation tracking technique is likely to be particularly effective in which circumstances, and how it should be conducted.


Biography and Bibliography
1. Hirt J, Nordhausen T, Appenzeller-Herzog C, Ewald H. Using citation tracking for systematic literature searching - study protocol for a scoping review of methodological studies and a Delphi study [version 3; peer review: 2 approved]. F1000Res 2021; 9:1386.

Appenzeller-Herzog-Use and benefit of citation tracking techniques in evidence synthesis-1169_a.pptx

ID: 1175 / O 1.1: 5
Oral Presentation
Topics: Information retrieval and evidence syntheses

Development and validation of a database filter for study size

Sabrina Gunput, Wichor Bramer

Erasmus MC, The Netherlands

Researchers performing systematic reviews often express the desire to limit the search results to a certain study size: "I want to include only studies of more than 100 patients." While the validity of such a request can of course be debated, limiting the search results to match the inclusion criteria can reduce the burden of screening for reviewers.

The aim of our study was to develop a filter in and Medline Ovid to retrieve references above a certain threshold of sample size. We compared the effectiveness of the filter in development using existing systematic reviews that report using sample size as an inclusion criterion.

Method/ Program Description
Together with researchers who expressed the desire to limit search results to a certain number of patients, we constructed preliminary filters, which were tested on the spot by evaluating the patient numbers of relevant references that had not been retrieved. If the patient numbers matched the inclusion criteria, the filter was adapted to retrieve the missed articles. After several rounds of improvement, the filter was tested against existing systematic reviews that used sample size as an inclusion criterion but did not limit their search accordingly.

Results/ Evaluation

The filter that was developed consists mainly of truncated numbers in proximity to words such as patients, cases, adults, females, etc., and phrases like "n=". The filter can and should be adapted to the research topic by combining these truncated numbers with specific terms for the diseases, interventions, or body parts of interest, such as melanomas, surgeries, eyes, or knees. The sensitivity of the filter, as evaluated on existing systematic reviews, was at least 94%. The references that were not retrieved were older articles that did not report the study size in their abstract.
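The building blocks of the filter (a number near a population word, or a phrase like "n=") can be illustrated with a regular expression; the pattern below is a simplified sketch of the idea, not the actual filter, which relies on database proximity operators, and the word list beyond those quoted in the text is assumed.

```python
import re

# Illustrative simplification of the filter idea: a three-or-more-digit
# number adjacent to a population word, or "n=" followed by such a number.
# The actual filter uses database proximity operators, not this regex.
pattern = re.compile(
    r"\b\d{3,}\s+(?:patients|cases|adults|females|males|participants)\b"
    r"|\bn\s*=\s*\d{3,}\b",
    re.IGNORECASE,
)

abstracts = [
    "We enrolled 150 patients with melanoma.",
    "A case report of a single patient.",
    "Retrospective cohort (n=2300) of knee surgeries.",
]
print([bool(pattern.search(a)) for a in abstracts])  # → [True, False, True]
```

As the abstract notes, this only works when the sample size actually appears in the record, which is why older articles without structured abstracts were missed.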


The study size filter is a good way to limit search results to a certain number of patients. It is not 100% sensitive, but few filters are. Current guidelines for abstract formats advise authors to report the number of patients in their abstract. We therefore expect the sensitivity of the filter only to improve for newer studies. A limitation is that the filters are only available in the interfaces of and Ovid and cannot be translated to PubMed, as the filter uses proximity operators, which are not available in PubMed.

Human Touch (Recommended)

With the study size filter the burden of screening for systematic reviews can be greatly reduced.

Gunput-Development and validation of a database filter for study size-1175_a.pptx

Gunput-Development and validation of a database filter for study size-1175_b.pdf

ID: 1174 / O 1.1: 6
Oral Presentation
Topics: Information retrieval and evidence syntheses

Information specialists: guardians of the scientific output of their institute

Wichor Bramer

Erasmus MC, The Netherlands

Researchers from many institutes, both academic and non-academic, perform and publish systematic reviews. Many of these institutes have a medical library that offers SR services to their researchers. Sometimes researchers seek assistance but cannot find it, or they fail to seek it, yet still pursue their review. Thus, many systematic reviews are published without the assistance of a medical librarian.
Our aim is to investigate barriers to using assistance from a medical library, and to develop ideas how we can improve the percentage of SRs that are assisted by medical libraries.
Method/ Program Description
We surveyed corresponding authors of systematic reviews from university hospitals. We asked them whether they had used assistance from a search specialist. If they had not, we asked further about the reasons and barriers for not seeking assistance. We also surveyed medical librarians from university hospitals about the percentage of SR projects in their institute that they serve. We will investigate the barriers to serving all requests.
Results/ Evaluation
The involvement of information specialists in systematic reviews varies substantially. In some organizations, 100% of all searches for systematic reviews are developed by information specialists, while in others almost no review is assisted by the library. The most important limiting factor for information specialists is insufficient capacity to do all searches. In many cases, researchers are not aware of the option to ask an information specialist for help, or they think they have enough skill to do the searches themselves. The vast majority of information specialists think it is very important that information specialists are involved in systematic reviews prior to publication.

Almost half of the responding systematic review authors have never used a librarian for their systematic reviews. Many are unaware that information specialists can assist them with searches. Many organizations do not have a medical library, or the library does not offer assistance with searches. Systematic review authors are much less convinced of the need to involve an information specialist in systematic reviews than information specialists are.

At our institute we assist 90% of the systematic reviews, thus improving the scientific quality of the publications. However, when we are asked to peer review systematic reviews, we see SRs from university employees based on inferior searches developed without the assistance of a librarian. The aim of our research is to inspire medical librarians to become guardians of their organization's systematic review output, either by offering peer review of researcher-developed searches or by offering librarian-mediated searches. That way, each SR project can be based on a high-quality search from the start.

Human Touch (Recommended)

We hope to inspire the audience to reach out to their researchers and offer help to those working on systematic reviews. This will increase the overall quality of published systematic reviews and thus of the treatments based on them.

Bramer-Information specialists-1174_a.pptx

Bramer-Information specialists-1174_b.pdf

Contact and Legal Notice · Contact Address:
Privacy Statement · Conference: EAHIL 2022
Conference Software - ConfTool Pro 2.6.144
© 2001–2022 by Dr. H. Weinreich, Hamburg, Germany