Adventures in screening systematic reviews

I’m screening systematic reviews at the full-text stage for an umbrella review, and I have seen some atrocious examples. If you follow me on xTwitter, you will have come across some of my rants (I also say hooray when authors get things right). In this post, I want to go through some of the common errors I’ve come across, to help others who are on the review-writing train.

  1. Searching in just one database
  2. Listing databases used
  3. The search strategy
  4. PRISMA
  5. Not consulting a librarian

Searching in just one database. A few reviews stated that only one database was searched. One had the audacity to state that no duplicates were found. If only one database is used, this makes it a literature review, NOT a systematic review.
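
For anyone wondering where duplicates come from in the first place: the same article turns up in more than one database, and the exported records are merged and de-duplicated before screening. Here is a minimal Python sketch of that idea, assuming citations have been exported as simple dictionaries (the field names and records are hypothetical); in practice your reference manager or screening tool does this for you.

```python
# A minimal sketch of cross-database deduplication. The "doi"/"title" field names
# and the sample records below are hypothetical, for illustration only.
def deduplicate(records):
    """Keep one copy of each citation, matching on DOI first, then on a normalised title."""
    seen, unique = set(), []
    for rec in records:
        doi = (rec.get("doi") or "").strip().lower()
        title = " ".join(rec.get("title", "").lower().split())
        key = doi or title
        if key and key in seen:
            continue  # duplicate across databases: expected when more than one database is searched
        if key:
            seen.add(key)
        unique.append(rec)
    return unique

medline = [{"doi": "10.1000/xyz1", "title": "Trial of drug X"}]
embase = [{"doi": "10.1000/xyz1", "title": "Trial of drug X"},
          {"doi": "10.1000/abc2", "title": "Cohort study of Y"}]
print(len(deduplicate(medline + embase)))  # prints 2: one duplicate removed
```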

Listing databases used. A common issue is listing platforms and publishers as databases. Another is not specifying which database was searched when a collection of databases was used. A review I looked at today listed Cochrane and Elsevier among its databases. Cochrane is an international organisation and Elsevier is the world’s largest STEM publisher; neither is a database. When you want to indicate that you searched databases in the Cochrane Library, specify whether it was the Cochrane Database of Systematic Reviews (CDSR) or CENTRAL. It is also important to state which platform was used to search: databases are available on a variety of different platforms, and naming the platform improves reproducibility. A few others have listed Web of Science (WoS) as a database. WoS is a platform that hosts many databases, including Medline. One review stated they searched PubMed, Medline and WoS (depending on their institution’s subscription, the WoS search was probably Medline too). They searched Medline three times!

The search strategy. Medline alone contains over 30 million citations. When the citations from all your database searching don’t even reach 500, something is seriously wrong with your search strategy. Other issues include very badly crafted search strategies (and I have seen some gawd awful ones, let me tell you). When someone is doing a critical appraisal of a systematic review, one of the first questions is about the search strategy: is it robust enough to make it worthwhile continuing the appraisal? Many reviews have fallen at this first hurdle. Another strange issue is the number of people running the search strategies. One review stated that all four authors ran the search strategy in all databases. Why? They would all get the same results (hopefully)!! Only one person needs to run the searches and download the citations.

A mantra to keep in mind: the search strategy is the foundation of the systematic review. A bad foundation undermines the review.
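
If you want a quick sanity check on yield before exporting anything, you can ask the database itself how many records a draft strategy retrieves. Below is a rough sketch using Biopython’s Entrez module against PubMed; the email address and search string are placeholders, not a recommended strategy.

```python
# A rough sketch: check the record count for a draft PubMed strategy using
# Biopython's Entrez utilities. The email address and search string are placeholders.
from Bio import Entrez

Entrez.email = "you@example.org"  # NCBI asks for a contact address when using E-utilities

# Hypothetical strategy fragment; a real strategy is built line by line, ideally with a librarian.
strategy = (
    '("exercise therapy"[MeSH Terms] OR exercis*[tiab]) '
    'AND ("low back pain"[MeSH Terms] OR "low back pain"[tiab])'
)

handle = Entrez.esearch(db="pubmed", term=strategy, retmax=0)  # retmax=0: only the count is needed
result = Entrez.read(handle)
handle.close()

print(f"Records retrieved: {result['Count']}")  # a suspiciously tiny number is a red flag
```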

PRISMA. PRISMA is a reporting guideline, as its full title says: Preferred Reporting Items for Systematic Reviews and Meta-Analyses. It is NOT a handbook or guideline for writing/conducting the review. When referring to PRISMA, please state that the review was reported using (or according to) PRISMA, not conducted using it.

Not consulting a librarian. All of these problems could have been avoided if the authors had included a librarian on the team. If your institution or organisation has librarians on staff, please consult them! Their involvement can range from reviewing your strategy and providing advice from the sidelines, right up to full authorship with screening and commenting on draft versions before journal submission. Do you want your review to pass the first test in critical appraisal? Get a librarian on board!!

Joining a clinical trial

I have read countless reports of clinical trials but I had never been in one. Until I got an invitation to join one. Last month, I got a letter from the hospital I attend (not the one I work at) along with an appointment letter. I thought it would be interesting to have an inside view of a trial and I contacted the lead researcher. After a phone call assessing eligibility, an appointment was made for me to visit the main hospital for an interview, blood test and covid booster. This took about an hour. What is this trial? It is the BOOST-IC trial. It aims to determine whether an extra covid booster vaccination gives extra protection to people with solid organ transplants (that’s me), blood cancers (like leukaemia), or AIDS. Even though my kidney is now failing (last bloods indicated 12% function), I was still eligible.

After the initial clinic visit, I had to fill out an online form about vaccine side effects, which included taking my temperature every day for 7 days. Then nothing doing until the following month when I will go to one of the hospital pathology clinics to have a blood test. Then some more weeks will go by and I will have another blood test.

At the first appointment, I was asked if I would like a copy of the results when they become available. I demurred, saying that I would look them up and read the published RCT. I asked which journal they were considering and they hadn’t decided on one yet; they have a wide choice due to the population they are studying. I also mentioned JANE (Journal Author Name Estimator) and that they could present at numerous conferences (AIDS, various cancer conferences, vaccination/immunology and renal).

It will be years until I get to read it though!

If you are invited to join a trial, or come across one that is recruiting and that you are eligible for, I encourage you to join. It’s an interesting project to be involved in.

Does ivermectin cause male infertility?

There has been a paper circulating on Twitter that people have used to back up claims that ivermectin results in male sterility. Is this true? Let’s look at the paper in question: Effect of ivermectin on male fertility and its interaction with P-glycoprotein inhibitor (verapamil) in rats. Environ Toxicol Pharmacol. 2008 Sep;26(2):206-11. doi: 10.1016/j.etap.2008.03.011. Epub 2008 Mar 29. https://pubmed.ncbi.nlm.nih.gov/21783912/

There are a number of steps to follow in order to assess the paper’s value. I will outline them below with comments.

Is this article from a journal published by a reputable publisher? If you look at the article record on PubMed, on the right you will see full text links. There is a button for Elsevier. Elsevier is a giant in science publishing. Good so far!

Next up – let’s look at the paper itself. Yes, that’s right, let’s look at the actual paper – NOT the abstract. You can’t make a judgement on the abstract alone. Abstracts can give false impressions and leave out important information; sometimes they are simply incorrect. So, if you want to make a decision using this paper, get the full text and read it. I can’t emphasise this enough. If you want to look around for papers in order to make informed decisions, that’s great. This is what healthcare staff want you to do! But you have to do the work. You can get this paper for just a small fee: go to your local public library and request it. This is called an interlibrary loan.

Now you have the paper, what next? Well, we are going to do what is called a critical appraisal. It’s a process anyone can do, and it helps you unpick and understand the contents. It is basically a checklist of things to look out for and think about. This is an animal study (I know humans are animals, but still ..), so we need a checklist for animal studies. The one I will use comes from a paper [1] proposing a set of questions to answer (see its Table 1 and Table 2). I haven’t used this checklist before, as all the checklists I’ve used have been for human studies. So let’s begin!

1] “If the study was conducted in a manner that suggests little internal bias, will it be useful for the ‘next step’ because the population is relevant to ‘the next step?’” (Yes/No). The checklist authors note that there is a presumption that results of laboratory animal studies can be extrapolated to humans, but this is not necessarily the case; the ‘next step’, they say, is to take the work to other animal populations. So the question is: is it worthwhile going forward with this study and conducting another experiment in a different animal population?

This is a small, early laboratory animal study in rats. Can I use it to make a health decision for myself or a family member? No. Let’s go back to the question we originally asked: does ivermectin cause infertility in male humans? This paper will not answer that question. It only asks whether ivermectin affects fertility in male rats, on its own and in combination with verapamil. To answer our question, you need a paper that looks at the side effects of ivermectin in humans – a study about drug safety.

Safety of high-dose ivermectin: a systematic review and meta-analysis. [2] This title looks like a paper that could answer our question, and there are good signs from the start: the journal is indexed in Medline and the paper is a systematic review. Systematic reviews take the studies found using a broad search strategy for a narrowly focussed question and analyse them together. Sometimes the studies are so similar that the individual results can be combined statistically – this is called meta-analysis. Systematic reviews still have to be critically appraised, though, and one tool I use is the CASP (Critical Appraisal Skills Programme) checklist. It gives lots of hints about where in a paper to look for the information and what to look for. If you have the full text, let’s do an appraisal.

  1. Is there a clearly focussed question? Yes – look at the last sentence in the Introduction.
  2. Did the authors look for the right type of papers? Ideally, RCTs should be used, but in this case that could be unethical. All study types that met the inclusion criteria were included.
  3. Do you think that all important and relevant studies were included? Look at the Methods section. Relevant databases were searched, all languages included, reference papers sought and authors contacted. There is no ready access to the search strategy, which is a shame, but let’s say Yes and continue.
  4. Did the authors do enough to assess the quality of the studies? There are two quality assessment sections – one for the meta-analysis, which used only RCTs (a big plus), and another for the other study types.
  5. If the results have been combined, was it reasonable to do so? This means, was there a meta-analysis? Yes, there were two for different drug doses. Was it reasonable to do so? Yes. The authors state that there was low heterogeneity, which means the results were similar enough to be able to be combined.
  6. What are the results? Adverse events were mild or moderate, and were connected not to the dosage but to the underlying condition the drug was being used to treat. There is a table of adverse effects and infertility is not in it. The adverse effects listed are: eye, brain, skin and other (things like swelling and back pain).
  7. How precise are the results? Looking at confidence intervals gives you an idea of how precise the estimate is: the narrower the interval, the better. The researchers are confident that if the analysis were repeated, similar results would appear. There is a range here – some intervals are wider than others, depending on the condition (a rough sketch of how a pooled estimate, its confidence interval and heterogeneity are calculated follows this list).
  8. Can the results be used in the local setting? This question is for health professionals. Are the populations in the study similar to mine?
  9. Were all the important outcomes considered? For our purposes, yes! We want to know about infertility as a side effect.
  10. Are the benefits worth the harms and costs? Ivermectin is a low-cost drug and easily obtainable. But our question wasn’t about using it to treat parasitic infection.
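
For the curious, questions 5 and 7 above hide a bit of arithmetic. Here is a rough Python sketch, with made-up numbers, of what combining results involves: inverse-variance weights give a pooled estimate, a 95% confidence interval sits around it, and Cochran’s Q with I² is the heterogeneity check the review’s authors refer to. This is the generic fixed-effect recipe, not necessarily the exact model used in the paper.

```python
# A rough sketch of inverse-variance (fixed-effect) pooling, its 95% confidence
# interval, and the I-squared heterogeneity statistic. All numbers are made up.
import math

# Hypothetical per-study estimates (e.g. log risk ratios) and their standard errors.
estimates = [0.10, 0.05, 0.12]
std_errs = [0.20, 0.25, 0.30]

weights = [1 / se ** 2 for se in std_errs]  # bigger studies (smaller SE) get more weight
pooled = sum(w * y for w, y in zip(weights, estimates)) / sum(weights)
se_pooled = math.sqrt(1 / sum(weights))
ci_low, ci_high = pooled - 1.96 * se_pooled, pooled + 1.96 * se_pooled

# Cochran's Q measures how far the studies scatter around the pooled estimate;
# I-squared expresses the share of that scatter beyond what chance would explain.
q = sum(w * (y - pooled) ** 2 for w, y in zip(weights, estimates))
df = len(estimates) - 1
i_squared = max(0.0, (q - df) / q) * 100 if q > 0 else 0.0

print(f"Pooled estimate {pooled:.3f} (95% CI {ci_low:.3f} to {ci_high:.3f}), I² = {i_squared:.0f}%")
```

A low I² (the “low heterogeneity” the authors report) is what makes combining the results reasonable in the first place.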

Take-away: there is NO evidence that ivermectin causes infertility in human males.

[1] O’Connor AM, Sargeant JM. Critical appraisal of studies using laboratory animal models. ILAR J. 2014;55(3):405–417. https://doi.org/10.1093/ilar/ilu038

[2] Navarro M, et al. Safety of high-dose ivermectin: a systematic review and meta-analysis. J Antimicrob Chemother. 2020;75(4):827–834. https://www.icpcovid.com/sites/default/files/2020-04/Safety%20of%20higher%20doses%20of%20Ivermectin%20JAC%202020.pdf