New paper: Contribution of Established Stroke Risk Factors to the Burden of Stroke in Young Adults


Just a relative risk is not enough to fully understand the implications of your findings. Sure, if you are an expert in a field, the context of that field will help you to assess the RR. But if you are not, the context of the numerator and denominator is often lost. There are several ways to work towards that context. If your question revolves around group discrimination (i.e. questions of diagnosis or prediction), the RR needs to be understood in relation to other predictors or diagnostic variables. That combination is best assessed through added discriminatory value, such as the improvement in AUC, or even fancier methods like reclassification tables and net benefit indices. But if you are interested in a single factor (e.g. in questions of causality or treatment), a number needed to treat (NNT) or the population attributable fraction (PAF) can be used.
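To make that concrete, here is a small illustrative sketch (all numbers invented, not from the paper) of how the same RR translates into very different NNTs depending on the absolute risks behind it, and how Levin's formula turns an RR plus exposure prevalence into a PAF:

```python
# Illustrative sketch (all numbers invented): the same relative risk (RR)
# can imply very different numbers needed to treat (NNT), and Levin's
# formula turns an RR plus exposure prevalence into a PAF.

def nnt(risk_exposed: float, risk_unexposed: float) -> float:
    """Number needed to treat = 1 / absolute risk difference."""
    return 1.0 / abs(risk_exposed - risk_unexposed)

def paf(prevalence: float, rr: float) -> float:
    """Levin's formula: PAF = p(RR - 1) / (p(RR - 1) + 1)."""
    return prevalence * (rr - 1.0) / (prevalence * (rr - 1.0) + 1.0)

# An RR of 2.0 with a rare vs a common outcome:
print(round(nnt(0.02, 0.01)))     # rare outcome:   NNT = 100
print(round(nnt(0.40, 0.20)))     # common outcome: NNT = 5

# An RR of 2.0 for an exposure present in 30% of the population:
print(round(paf(0.30, 2.0), 3))   # PAF = 0.231, i.e. ~23% of cases attributable
```

Note how the NNT depends entirely on the absolute risks behind the RR — exactly the numerator and denominator context that gets lost when only the RR is reported.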

The PAF has been the subject of my publications before, for example in these papers where we used the PAF to provide context for the different ORs of markers of hypercoagulability in the RATIO study / in a systematic review. This paper is a more general text, as it is meant to give non-epidemiologists an insight into what epidemiology can bring to the field of law. Here, the PAF is an interesting measure, as it is related to the etiological fraction – a number that can be very interesting in tort law. Some of my slides from a law symposium that I attended address these questions and that particular Dutch case of tort law.

But the PAF is and remains an epidemiological measure and tells us what fraction of the cases in the population can be attributed to the exposure of interest. You can combine the PAFs of several factors into a single number (given some assumptions, which basically boil down to the idea that the combined factors work on an exactly multiplicative scale, both statistically and biologically). A 2016 Lancet paper that made a huge impact and increased interest in the concept of the PAF was the INTERSTROKE paper. It showed that up to 90% of all stroke cases can be attributed to only 10 factors, all of them modifiable.
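Under that multiplicative assumption, single-factor PAFs combine as one minus the product of the unexplained fractions, which is how ten factors can jointly explain 90% of cases even though their individual PAFs do not simply add up. A minimal sketch with invented PAF values:

```python
# Sketch of combining single-factor PAFs under the multiplicative
# assumption discussed above; the PAF values below are invented.

def combined_paf(pafs):
    """Combined PAF = 1 - product over factors of (1 - PAF_i)."""
    unexplained = 1.0
    for p in pafs:
        unexplained *= (1.0 - p)   # fraction of cases still unexplained
    return 1.0 - unexplained

# Four hypothetical modifiable risk factors: their individual PAFs sum
# to 1.30 (impossible as a fraction), but combine to a valid 0.795.
print(round(combined_paf([0.40, 0.35, 0.30, 0.25]), 3))   # 0.795
```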

We had the question whether the same holds for young stroke patients. After all, the longstanding idea is that young stroke is a different disease from old stroke, in which traditional CVD risk factors play a less prominent role. The idea is that more exotic causal mechanisms (e.g. hypercoagulability) play a more prominent role in this age group. Boy, were we wrong. In a dataset which combines data from the SIFAP and GEDA studies, we noticed that the bulk of the cases can be attributed to modifiable risk factors (80% to 4 risk factors). There are some elements of the paper (an age effect even within the young study population, subtype effects, definition effects) that I won't go into here. For that you need to read the paper – published in Stroke – here, or via my Mendeley account. The main part of the work was done by AA and UG. Great job!

new paper in RPTH: Statins and the risk of DVT recurrence

I am very happy and honored to tell you that our paper “Statin use and risk of recurrent venous thrombosis: results from the MEGA follow-up study” is the very first paper in the new ISTH journal “Research and Practice in Thrombosis and Haemostasis“.

This new journal, for which I serve on the editorial board, is the sister journal of the JTH, but has a couple of focus points that are not present in the JTH. The biggest difference is the open access policy of RPTH. Next to that, there are a couple of article types and subjects that RPTH welcomes which are perhaps not so common in traditional journals (e.g. career development articles, educationals, nursing and patient perspectives, etc.).

Our paper is, however, a very standard paper, in the sense that it is an original research publication on the role of statins in the risk of thrombosis recurrence. We answer the question whether statin use is indeed linked to a lower risk of recurrence based on observational data, which opens the door to confounding by indication. To counteract this, we applied a propensity score and, most important of all, we only used so-called “incident users”. Incident vs prevalent users of statins is a theme that has been a topic on this blog before (for example here and here). The bottom line is this: people who are currently using statins are different from people who are prescribed statins – adherence issues, side effects, or low life expectancy could be reasons for discontinuation. You need to take the difference between these types of statin users into account, or the protective effect of statins, or any other medication for that matter, might be biased. In the case of statins and DVT recurrence it can be argued that the risk-lowering effect of statins is overestimated. In itself that is not a problem in an observational study. But if the results of this observational study are subsequently used in a sample size calculation for a proper trial, that trial will be underpowered and we might have lost our (expensive and potentially only) shot at really knowing whether or not DVT patients benefit from statins.
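To see why prevalent users can make a drug look better than it is, here is a toy simulation (my own sketch with invented numbers, not the MEGA data): frail patients both discontinue statins more often and have a higher recurrence risk, so the surviving prevalent users are a healthier-than-average subset even when the drug has no effect at all.

```python
import random

random.seed(42)

# Toy simulation (invented numbers, not the MEGA data) of prevalent-user
# bias. The drug has NO true effect here, yet the "prevalent users" look
# like they have a lower recurrence risk than the full group of starters.

def simulate(n=100_000):
    incident, prevalent = [], []
    for _ in range(n):
        frail = random.random() < 0.30            # 30% of patients are frail
        base_risk = 0.20 if frail else 0.10       # recurrence risk; drug does nothing
        recurrence = random.random() < base_risk
        incident.append(recurrence)               # incident users: everyone who started
        stopped = random.random() < (0.60 if frail else 0.10)  # frail stop more often
        if not stopped:
            prevalent.append(recurrence)          # prevalent users: still on the drug
    def risk(xs):
        return sum(xs) / len(xs)
    return risk(incident), risk(prevalent)

incident_risk, prevalent_risk = simulate()
print(f"incident users:  {incident_risk:.3f}")    # ~0.13, the true average risk
print(f"prevalent users: {prevalent_risk:.3f}")   # lower: looks protective, but is not
```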

RPTH will be launched during ISTH 2017 which will be held in Berlin in a couple of weeks.

New paper: A Prothrombotic Score Based on Genetic Polymorphisms of the Hemostatic System Differs in Patients with IS, MI, or PAOD

My first paper in Frontiers in Cardiovascular Medicine, an open access platform focussed on cardiovascular medicine. This is not a regular case-control study, in which the prevalence of a risk factor is compared between an unselected patient group and a reference group from the general population. No, this paper takes patients with cardiovascular disease who were referred for thrombophilia testing. When the different diseases (ischemic stroke vs myocardial infarction / PAOD) are then compared in terms of their thrombophilic propensity, it is clear that these groups are different. The first explanation that comes to mind might be that thrombophilia indeed plays a different role in the etiology of these diseases, as we demonstrated in a RATIO publication as well as in this systematic review, but it might also be that there is just a different referral pattern. In any case, it indicates that the role of thrombophilia – whether causal or physician-suspected – differs between the different forms of arterial thrombosis.

Virchow’s triad and lessons on the causes of ischemic stroke

I wrote a blog post for BMC, the publisher of Thrombosis Journal, to celebrate blood clot awareness month. I took my two favorite subjects, i.e. stroke and coagulation, added some history, and voilà! The BMC version can be found here.

When I look out of my window from my office at the Charité hospital in the middle of Berlin, I see the old pathology building in which Rudolf Virchow used to work. The building is just as monumental as the legacy of this famous pathologist who gave us what is now known as Virchow’s triad for thrombotic diseases.

In ‘Thrombose und Embolie’, published in 1865, he postulated that the consequences of thrombotic disease can be attributed to one of three categories: phenomena of interrupted blood flow, phenomena associated with irritation of the vessel wall and its vicinity, and phenomena of blood coagulation. This concept has since been modified to describe the causes of thrombosis and has been a guiding principle for many thrombosis researchers.

The traditional split in interest between arterial thrombosis researchers, who focus primarily on the vessel wall, and venous thrombosis researchers, who focus more on hypercoagulation, might not be justified. Take ischemic stroke for example. Lesions of the vascular wall are definitely a cause of stroke, but perhaps only in the subset of patients who experience a so-called large vessel ischemic stroke. It is also well established that a disturbance of blood flow in atrial fibrillation can cause cardioembolic stroke.

Less well studied, but perhaps no less relevant, is the role of hypercoagulation as a cause of ischemic stroke. It seems that an increased clotting propensity is associated with an increased risk of ischemic stroke, especially in the young, in whom the cause of the stroke remains undetermined in up to a third of cases. Perhaps hypercoagulability plays a much more prominent role than we traditionally assume?

But this ‘one case, one cause’ approach takes Virchow’s efforts to classify thrombosis a bit too strictly. Many diseases can be called multi-causal, which means that no single risk factor in itself is sufficient and only a combination of risk factors working in concert cause the disease. This is certainly true for stroke, and translates to the idea that each different stroke subtype might be the result of a different combination of risk factors.

If we combine Virchow’s work with the idea of multi-causality and the heterogeneity of stroke subtypes, we can reimagine Virchow’s triad (figure 1). In this version, patient groups or even individuals are scored according to the relative contribution of the three classical categories.

From this figure, one can see that some subtypes of ischemic stroke might be more like some forms of venous thrombosis than other forms of stroke, a concept that could bring new ideas for research and perhaps has consequences for stroke treatment and care.

Figure 1. An example of a gradual classification of ischemic stroke and venous thrombosis according to the three elements of Virchow’s triad.

However, recent developments in the field of stroke treatment and care have focused on the acute treatment of ischemic stroke. Stroke ambulances that can discriminate between hemorrhagic and ischemic stroke – information needed to start thrombolysis in the ambulance – drive the streets of Cleveland, Gothenburg, Edmonton and Berlin. Other major developments are in the field of mechanical thrombectomy, with wonderful results from many studies such as the Dutch MR CLEAN study. Even though these two new approaches save lives and prevent disability in many, they are ‘too late’ in the sense that they are reactive and do not prevent clot formation.

Therefore, in this blood clot awareness month, I hope that stroke and thrombosis researchers join forces and further develop our understanding of the causes of ischemic stroke so that we can Stop The Clot!

Increasing efficiency of preclinical research by group sequential designs: a new paper in PLOS biology

We have another paper published in PLOS Biology. The theme is in the same area as the first paper I published in that journal, which had the wonderful title “where have all the rodents gone”. But this time we did not focus on threats to internal validity; instead, we explored whether sequential study designs can be useful in preclinical research.

Sequential designs – what are those? It is a family of study designs (perhaps you could call it the “adaptive study size design” family) in which one takes a quick peek at the results before the total number of subjects is enrolled. But this peek comes at a cost: it should be taken into account in the statistical analyses, as it has direct consequences for the interpretation of the final result of the experiment. The bottom line is this: with the information you get halfway through, you can decide to continue with the experiment or to stop for efficacy or futility reasons. If this sounds familiar to those who know interim analyses in clinical trials, that is because it is the same concept. However, we explored its impact when applied to animal experiments.
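A minimal sketch of the idea (my own illustration with invented effect and sample sizes, not the designs from our paper): a two-look, Pocock-style group sequential test, where using the same critical z of about 2.178 at each of the two looks keeps the overall type I error near 5%:

```python
import math
import random

random.seed(1)

# Two-look, Pocock-style group sequential design (my own toy version):
# the same critical z of ~2.178 at both looks keeps the overall type I
# error near 0.05 for two analyses. Effect size and sample sizes invented.

Z_POCOCK = 2.178  # per-look critical value for 2 Pocock looks, overall alpha = 0.05

def z_stat(sample):
    """z statistic for H0: mean = 0, with known sd = 1 for simplicity."""
    n = len(sample)
    return (sum(sample) / n) * math.sqrt(n)

def sequential_trial(effect, n_per_stage=20):
    """Returns (rejected H0, total sample size used)."""
    data = [random.gauss(effect, 1.0) for _ in range(n_per_stage)]
    if abs(z_stat(data)) > Z_POCOCK:              # interim look: stop for efficacy
        return True, n_per_stage
    data += [random.gauss(effect, 1.0) for _ in range(n_per_stage)]
    return abs(z_stat(data)) > Z_POCOCK, 2 * n_per_stage

# Under a real effect, many experiments stop at the interim look,
# so the average sample size is well below the fixed-design maximum:
results = [sequential_trial(effect=0.6) for _ in range(2000)]
avg_n = sum(n for _, n in results) / len(results)
power = sum(hit for hit, _ in results) / len(results)
print(f"average sample size: {avg_n:.1f} (max 40), power: {power:.2f}")
```

With these invented settings a large share of the simulated experiments stops at the interim look, which is where the average saving in animals comes from.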

Figure from our publication in PLOS Biology describing sequential study designs in our computer simulations

“Old wine in new bottles,” one might say, and some of the reviewers of this paper rightfully pointed out that it was not novel in showing that sequential designs are more efficient than non-sequential designs. But that is not where the novelty lies. Up until now, we have not seen this approach applied to preclinical research in a formal way. However, our experience is that a lot of preclinical studies are done with some kind of informal sequential aspect. No p<0.05? Just add another mouse/cell culture/synapse/MRI scan to the mix! The problem here is that there is no formal framework in which this is done, leading to cherry-picking, p-hacking and other nasty stuff that you can’t grasp from the methods and results sections.
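How bad is that informal peeking? A toy simulation (my own illustration, not from the paper): test after every added observation and stop as soon as p < 0.05. Even with no true effect at all, the false positive rate climbs well above the nominal 5%:

```python
import math
import random

random.seed(7)

# Toy simulation of the informal "just add another mouse" practice:
# start with 10 observations, test after every added one, and stop as
# soon as p < 0.05. There is NO true effect in these data.

def p_value(sample):
    """Two-sided p for H0: mean = 0, known sd = 1 (normal approximation)."""
    n = len(sample)
    z = abs(sum(sample) / n) * math.sqrt(n)
    return 2.0 * (1.0 - 0.5 * (1.0 + math.erf(z / math.sqrt(2.0))))

def peeking_experiment(n_start=10, n_max=30):
    data = [random.gauss(0.0, 1.0) for _ in range(n_start)]
    while True:
        if p_value(data) < 0.05:
            return True                       # "significant" -> stop and publish
        if len(data) >= n_max:
            return False
        data.append(random.gauss(0.0, 1.0))   # just add another mouse...

false_positive_rate = sum(peeking_experiment() for _ in range(2000)) / 2000
print(f"false positive rate with peeking: {false_positive_rate:.3f}")  # well above 0.05
```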

Should all preclinical studies from now on have sequential designs? My guess would be NO, and there are two major reasons why. First of all, sequential data analyses have their idiosyncrasies and might not be for everyone. Second, the logistics of sequential study designs are complex, especially if you are afraid to introduce batch effects. We only wanted to show preclinical researchers that the sequential approach has its benefits: the same information at, on average, lower cost. If you translate “cost” into animals, the obvious conclusion is: apply sequential designs where you can, and the saved animals can be “re-invested” in more animals per study to obtain higher power in preclinical research. But I hope that the side effect of this paper (or perhaps its main effect!) will be that readers think about their current practices and whether these involve the ‘informal sequential designs’ that really hurt science.

The paper, this time with a less exotic title, “Increasing efficiency of preclinical research by group sequential designs”, can be found on the website of PLOS Biology.

Associate editor at BMC Thrombosis Journal


In the week just before Christmas, HtC approached me asking whether I would like to join the editorial board of BMC Thrombosis Journal as an Associate Editor. The aims and scope of the journal, taken from their website:

“Thrombosis Journal is an open-access journal that publishes original articles on aspects of clinical and basic research, new methodology, case reports and reviews in the areas of thrombosis. Topics of particular interest include the diagnosis of arterial and venous thrombosis, new antithrombotic treatments, new developments in the understanding, diagnosis and treatments of atherosclerotic vessel disease, relations between haemostasis and vascular disease, hypertension, diabetes, immunology and obesity.”

I talked to HtC, someone at BMC, as well as some of my friends and colleagues whether or not this would be a wise thing to do. Here is an overview of the points that came up:

Experience: Thrombosis is the field where I grew up as a researcher. I know the basics, and have extensive knowledge of specific parts of the field. But with my move to Germany I started to focus on stroke, so one might wonder why I would not use my time to work with a stroke-related journal. My answer is that the field of thrombosis is a stroke-related field and that my position in both worlds is a good opportunity to learn from both. Sure, there will be topics I have less knowledge of, but this is where an associate editor should rely on expert reviewers and fellow editors.

This new position will also provide me with a bunch of new experiences in itself: for example, sitting on the other side of the table in a peer review process might help me to better understand a rejection of one of my own papers. Bottom line is that I think that I both bring and gain relevant experiences in this new position.

Time: These things cost time. A lot. Especially when, like me, you still need to learn the skills required for the job. But learning these skills as an associate editor is an integral part of the science apparatus, and I am sure that the time I invest will help me develop as a scientist. Also, the time I need to spend is not necessarily the type of time that I currently lack, i.e. writing time. For writing and doing research myself I need decent blocks of time to dive in and focus – 4+ hours if possible. The time I need for my associate editor tasks is more fragmented: finding peer reviewers, reading their comments and making a final judgement are relatively fragmented activities, and I am sure that as soon as I get the hang of it I can squeeze those activities into shorter slots of time. Perhaps a nice way to fill those otherwise lost 30 minutes between two meetings?

Open science: Thrombosis Journal is part of the BioMed Central family. As such, it is a 100% OA journal. It is not that I am an open science fanboy or sceptic, but I am very curious how OA is developing, and working with an OA journal will help me to understand what OA can and cannot deliver.

Going over these points, I am convinced that I can contribute to the journal with my experience in the fields of coagulation, stroke and research methodology. Also, I think that the time it will take to learn the necessary skills is an investment that in the end will help me grow as a researcher. So, I replied to HtC with a positive answer. Expect emails requesting peer review reports soon!

New team member!

A couple of weeks ago I announced that my team was looking for a new post-doc. I received many applications, some even from as far away as Italy and Spain. Out of this pile of candidates we were able to find one who fulfilled all the requirements we had in mind and then some. It is great that she will join the team in December. JH has worked in the field of epidemiology for quite some time and is not only experienced in setting up new projects and providing physicians with methodological input on their clinical research projects, but she also has a great interest in the more methodological side of epidemiology. For example, she is a co-author/developer of the program DAGitty, which can be used to draw causal diagrams. She is also the speaker of the working group on methodology of the German Society of Epidemiology (DGEpi). Her background in psychology also means that she brings a lot of knowledge on methods that we as a team did not have so far. In short, a great addition to the team. Welcome JH!