New paper: pulmonary dysfunction and CVD outcome in the ELSA study

This is a special paper to me, as it is 100% the product of my team at the CSB. Well, 100%? Not really. This is the first paper from a series of projects in which we work with open data, i.e. data collected by others who subsequently shared it. A lot of people talk about open data and how all data that is created should be made available to other researchers, but not a lot of people talk about actually using that kind of data. For that reason we have picked a couple of data resources to see how easy it is to work with data that we did not collect ourselves.

It is hard, as we have now learned. Even though the studies we have focussed on (the ELSA study and UK Understanding Society) have a good description of their data and methods, understanding this takes time and effort. And even after putting in all that time and effort, you might still not know all the little details and idiosyncrasies in the data.

A nice example lies in the exposure that we used in these analyses, pulmonary dysfunction. The data for this exposure were captured in several different datasets, in different variables. Reverse engineering a logical and interpretable concept out of these data points was not easy. This is perhaps also true for data that you collect yourself, but then at least that thinking is more or less done before data collection starts and no reverse engineering is needed.

So we learned a lot. Not only about the role of pulmonary dysfunction as a cause of CVD (hint, it is limited), or about the different sensitivity analyses that we used to check the influence of missing data on the conclusions of our main analyses (hint, limited again), or the need to update an exposure that progresses over time (hint, relevant), but also about what it is like to use data collected by others (hint, useful but not easy).

The paper, with the title “Pulmonary dysfunction and development of different cardiovascular outcomes in the general population”, with IP as the first author, can be found on PubMed or via my Mendeley profile.


New paper: Contribution of Established Stroke Risk Factors to the Burden of Stroke in Young Adults


Just a relative risk is not enough to fully understand the implications of your findings. Sure, if you are an expert in a field, the context of that field will help you to assess the RR. But if you are not, the context of the numerator and denominator is often lost. There are several ways to provide that context. If you have a question that revolves around group discrimination (i.e. questions of diagnosis or prediction), the RR needs to be understood in relation to other predictors or diagnostic variables. That combination is best assessed through the added discriminatory value, such as the improvement in AUC, or even fancier methods like reclassification tables and net benefit indices. But if you are interested in a single factor (e.g. in questions of causality or treatment), a number needed to treat (NNT) or the population attributable fraction (PAF) can be used.
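As a quick refresher, these are the standard textbook formulas (not taken from any of the papers discussed here): for a binary exposure with prevalence $p_e$ and relative risk $RR$, Levin's formula gives

$$\mathrm{PAF} = \frac{p_e\,(RR - 1)}{1 + p_e\,(RR - 1)},$$

and the NNT follows directly from the absolute risk reduction, $\mathrm{NNT} = 1/\mathrm{ARR}$.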

The PAF has been the subject of my publications before, for example in these papers where we use the PAF to provide the context for the different ORs of markers of hypercoagulability in the RATIO study / in a systematic review. This paper is a more general text, as it is meant to provide non-epidemiologists with an insight into what epidemiology can bring to the field of law. Here, the PAF is an interesting measure, as it is related to the etiological fraction – a number that can be very interesting in tort law. Some of my slides from a law symposium that I attended address these questions and that particular Dutch case of tort law.

But the PAF is and remains an epidemiological measure and tells us what fraction of the cases in the population can be attributed to the exposure of interest. You can combine the PAFs of several factors into a single number (given some assumptions, which basically boil down to the idea that the combined factors work on an exact multiplicative scale, both statistically as well as biologically). A 2016 Lancet paper that made a huge impact and increased interest in the concept of the PAF was the INTERSTROKE paper. It showed that up to 90% of all stroke cases can be attributed to only 10 factors, all of them modifiable.
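For completeness, the combination mentioned above is usually calculated as (again a textbook formula, valid under the multiplicativity assumptions sketched above, and not specific to INTERSTROKE):

$$\mathrm{PAF}_{\text{combined}} = 1 - \prod_{i=1}^{k} \left(1 - \mathrm{PAF}_i\right),$$

which also explains why a combined estimate can approach, but never exceed, 100%.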

We had the question whether this was the same for young stroke patients. After all, the longstanding idea is that young stroke is a different disease from old stroke, in which traditional CVD risk factors play a less prominent role. The idea is that more exotic causal mechanisms (e.g. hypercoagulability) play a more prominent role in this age group. Boy, were we wrong. In a dataset that combines data from the SIFAP and GEDA studies, we noticed that the bulk of the cases can be attributed to modifiable risk factors (80% to 4 risk factors). There are some elements of the paper (an age effect even within the young study population, subtype effects, definition effects) that I won't go into here. For that you need to read the paper -published in Stroke- here, or via my Mendeley account. The main work was done by AA and UG. Great job!

Advancing prehospital care of stroke patients in Berlin: a new study to see the impact of STEMO on functional outcome

There are strange ambulances driving around in Berlin. They are the so-called STEMO cars, or Stroke Einsatz Mobile, basically mobile stroke units. They can perform a CT scan to rule out bleeds and subsequently start thrombolysis before arriving at the hospital. A previous study showed that this decreases time to treatment by ~25 minutes. The question now is whether the patients are indeed better off in terms of functional outcome. For that we are currently running the B_PROUD study, of which we recently published the design here.

Virchow’s triad and lessons on the causes of ischemic stroke

I wrote a blog post for BMC, the publisher of Thrombosis Journal, in order to celebrate blood clot awareness month. I took my two favorite subjects, i.e. stroke and coagulation, added some history, and voilà! The BMC version can be found here.

When I look out of my window from my office at the Charité hospital in the middle of Berlin, I see the old pathology building in which Rudolf Virchow used to work. The building is just as monumental as the legacy of this famous pathologist, who gave us what is now known as Virchow’s triad for thrombotic diseases.

In ‘Thrombose und Embolie’, published in 1856, he postulated that the consequences of thrombotic disease can be attributed to one of three categories: phenomena of interrupted blood flow, phenomena associated with irritation of the vessel wall and its vicinity, and phenomena of blood coagulation. This concept has since been modified to describe the causes of thrombosis and has been a guiding principle for many thrombosis researchers.

The traditional split in interest between arterial thrombosis researchers, who focus primarily on the vessel wall, and venous thrombosis researchers, who focus more on hypercoagulation, might not be justified. Take ischemic stroke, for example. Lesions of the vascular wall are definitely a cause of stroke, but perhaps only in the subset of patients who experience a so-called large vessel ischemic stroke. It is also well established that a disturbance of blood flow in atrial fibrillation can cause cardioembolic stroke.

Less well studied, but perhaps not less relevant, is the role of hypercoagulation as a cause of ischemic stroke. It seems that an increased clotting propensity is associated with an increased risk of ischemic stroke, especially in the young, in whom the cause of stroke remains undetermined in about a third of cases. Perhaps hypercoagulability plays a much more prominent role than we traditionally assume?

But this ‘one case, one cause’ approach takes Virchow’s efforts to classify thrombosis a bit too strictly. Many diseases can be called multi-causal, which means that no single risk factor in itself is sufficient and only a combination of risk factors working in concert causes the disease. This is certainly true for stroke, and it translates to the idea that each stroke subtype might be the result of a different combination of risk factors.

If we combine Virchow’s work with the idea of multi-causality and the heterogeneity of stroke subtypes, we can imagine a new version of Virchow’s triad (figure 1). In this version, patient groups or even individual patients are scored according to the relative contribution of the three classical categories.

From this figure, one can see that some subtypes of ischemic stroke might be more like some forms of venous thrombosis than other forms of stroke, a concept that could bring new ideas for research and perhaps has consequences for stroke treatment and care.

Figure 1. An example of a gradual classification of ischemic stroke and venous thrombosis according to the three elements of Virchow’s triad.

However, recent developments in the field of stroke treatment and care have been focused on the acute treatment of ischemic stroke. Stroke ambulances that can discriminate between hemorrhagic and ischemic stroke (information needed to start thrombolysis in the ambulance) drive the streets of Cleveland, Gothenburg, Edmonton and Berlin. Other major developments are in the field of mechanical thrombectomy, with wonderful results from many studies such as the Dutch MR CLEAN study. Even though these two new approaches save lives and prevent disability in many, they are ‘too late’ in the sense that they are reactive and do not prevent clot formation.

Therefore, in this blood clot awareness month, I hope that stroke and thrombosis researchers join forces and further develop our understanding of the causes of ischemic stroke so that we can Stop The Clot!

Cardiovascular events after ischemic stroke in young adults (results from the HYSR study)


The collaboration with the group in Finland has turned into a nice new publication, with the title

“Cardiovascular events after ischemic stroke in young adults”

This work, with data from Finland, was primarily done by KA and JP. KA came to Berlin to learn some epidemiology with the aid of the Virchow scholarship, so that is where we came in. It was great to have KA as part of the team, and even better to have been working with their great data.

Now on to the results of the paper: as in the RATIO follow-up study, the risk of recurrence after young stroke remained present long after the initial stroke in this analysis of the Helsinki Young Stroke Registry. But unlike the RATIO paper, this dataset had more information on the patients, for example the TOAST classification. This means that we were able to identify that the group with large artery atherosclerosis (LAA) had a very high risk of recurrence.

The paper can be found on the website of Neurology, or via my Mendeley profile.

Pregnancy loss and risk of ischaemic stroke and myocardial infarction


Together with colleagues I worked on a paper on the relationship between pregnancy, its complications, and stroke and myocardial infarction in young women, which just appeared online on the BJH website.

The article, which analyses data from the RATIO study, concludes that only women who have had multiple pregnancy losses have an increased risk of stroke (OR 2.4) compared with those who never experienced a pregnancy loss. The work was mainly done by AM, and it is a good example of an international collaboration in which we benefitted from the expertise of all team members.

The article, with the full title “Pregnancy loss and risk of ischaemic stroke and myocardial infarction” can be found via PubMed, or via my personal Mendeley page.

Statins and risk of poststroke hemorrhagic complications

Easter brought another publication, this time with the title

“Statins and risk of poststroke hemorrhagic complications”

I am very pleased with this paper as it demonstrates two important aspects of my job. First, I was able to share my thoughts on comparing current users vs never users. As has been argued before (e.g. by the group of Hernán) and also articulated in a letter to the editor I wrote with colleagues from Leiden, such a comparison brings with it an inherent survival bias: you are comparing never users (i.e. those without an indication) with current users (those who have the indication, can handle the side effects of the medication, and stay alive long enough to be enrolled into the study as users). This matter is of course only relevant if you want to test the effect of statins, not if you are interested in the mere predictive value of being a statin user.

The second thing about this paper is the way we were able to use data from the VISTA collaboration, which is a large amount of data pooled from previous stroke studies (RCTs and observational studies). I believe such ways of sharing data bring science forward. Should all data be shared online for all to use? I am not sure of that, but the easy-access model of the VISTA collaboration (which includes data maintenance, harmonization, etc.) is certainly appealing.

The paper can be found here, and on my Mendeley profile.

 

– update 1.5.2016: this paper was the topic of a comment in the @greenjournal. See also their website.

update 19.5.2016: this project also led to first author JS being awarded the young researcher award of the ESOC2016.