Watch Antonia Geck and Cora Klockenbusch ask attendees about their impression of ECSJ2017.
For the full conference coverage, head over to ECSJ’s website.
Hot young girls in high heels. Powdered make-up exploding across bubbling and steamy apparatus. Equations written in lipstick. Sounds like a normal day in the lab for most women scientists. Except it isn’t. The scenes are, of course, snippets from the roundly and soundly derided ‘Science: It’s a girl thing’ video released to shock and awe–the bad kind–in 2012.
Born of a well-meaning but inherently flawed campaign from the European Commission, the video has been criticised and parodied to the point that further condemnation for reinforcing stereotypes would be like pulling a girl’s hair and stealing her chocolate. Marie Curie’s appearance may arguably not have been as attractive as a catwalk model’s, but if you needed a visual prop to picture a beautiful mind, she would be a shining star.
To be fair to the EC, it’s not hard to see why they thought they had to do something: a She Figures 2012 report points out that the share of women graduating at PhD level now stands at 46%, but women account for only 33% of researchers in the EU. And while 59% of EU graduate students in 2010 were female, only 20% of EU senior academicians were women.
Is the image of women scientists to blame for the lack of popularity of science studies? It is clear the problems begin before university and academia. The UK’s Institute of Physics has found that for the last two decades in the UK only 20% of physics students past age 16 have been girls, despite about equal success for boys and girls in physics and science exams leading to that point.
How much could changing the image of female scientists do to solve the two problems that persist: boosting girls’ involvement in science from an early age, and removing the barriers that keep female scientists from top positions once they get there?
A classic remedy to anyone with an image problem would be to try and alter that image through advertising campaigns. But do these get-girls-to-do-science campaigns really work? “I don’t know,” says Claudine Hermann, Vice President of the European Platform for Women Scientists who in 1992 was the first woman to be appointed as a professor at military engineering school École Polytechnique, in Paris, France.
The trouble is that such campaigns often fail to convince. Hermann has spent the past 20 years immersed in the challenge and says there have been plenty of campaigns to convince girls—and boys—to go into science. “But they have not been very efficient,” she says. “You cannot know what would have happened if the campaign had not existed.” It sounds like the perfect area to test a policy change through a randomised trial. Others concur that advertising has obvious limitations. “I don’t think one video will make any difference,” says sociologist Louise Archer from King’s College London, who, like most, is not a great fan of the ‘Science: It’s a girl thing’ video. But she says she could see what they were trying to do.
Rather, the image problem may just be the tip of the iceberg, where deeply ingrained cultural and social perceptions are slow to evolve. “Our research shows the masculine image of science is an issue and ‘girly’ girls are much less likely to aspire to science careers than ‘non-girly’ girls, even though they both like science at school,” Archer explains. This suggests that young girls displaying an interest in science may fear being regarded as uncool by boys. “Analysis shows single-sex schools are the most effective way of getting girls to study physics A level,” she notes. “Our surveys of over 9,000 primary and 5,600 secondary pupils show that the ‘brainy’ image of science is also a key part of the problem and can be particularly off-putting for girls. And social class is as much an issue as gender.”
Confirmation of the overwhelming impact of culture and social influences can be found in Eastern Europe, after the Second World War. The communist ideology dictated equal ability and opportunity between the sexes as society forged on as one, powered by engineering and science. Indeed, the EU expert group Enwise (Enlarge Women in Science to East) published a report concluding that the availability of childcare facilities and state support for working mothers led to a significant proportion of well qualified women in high profile roles, particularly in science.
Unfortunately, as the Communist Bloc unravelled so did funding, infrastructure and many of the benefits, although the region still has a higher proportion of women researchers in science than the West today. This was particularly true in countries with smaller populations, which face challenges such as being frozen out of more competitive, high-cost research programmes. As Hermann notes, “it’s complex.” A 2008 Czech report provides an update on the status of women in science in the Czech Republic, Hungary, Poland, Slovakia and Slovenia.
The issue is not just about stereotypes, however, and could also be linked to a widespread lack of knowledge of the high transferability of science qualifications. “Most young people don’t realise science qualifications are useful for a wide range of jobs both in and out of science,” says Archer.
Perhaps the lack of role models is also to blame. Hermann says that the under-representation of women scientists in the media is a further problem: from TV appearances to museum exhibits, the media often fail to recognise the role women play in science.
On the positive side, women already in science today stand a better chance of climbing the career ladder than before. Hermann cites programmes in Switzerland and Ireland that led to more women professors. “If there is a state policy and real funds, things can change,” she says. “But if you just speak there will be very slow evolution. You need political will.”
And for political action you need increased awareness that there is a problem, which has gained much more prominence according to physicist Athene Donald from the University of Cambridge, UK. She cites the Athena SWAN Charter for women in science, applied for and awarded to universities, as an action that “has certainly raised everyone’s awareness and also the stakes.” A 2009 winner of the L’Oréal-UNESCO Awards for Women in Science and a noted blogger on the topic, Donald says actions are needed too. “This isn’t about generational change. This action will be more important at later career stages, university and beyond.” Actions that might work right now include not writing ‘Science: It’s a girl thing’ in lipstick on the EC’s revamped website.
Researchers put model predictions of radiation to the test ahead of future manned missions to Mars.
Cold, dry, airless – Mars doesn’t make the most comfortable environment for human exploration. But what makes a manned mission to the Red Planet truly dangerous is its radiation, which is thought to be more than 500 times more potent than here on Earth.
Now, a team based in Germany and the US has made an important step towards predicting when, where and with what strength this radiation will strike. Their work, which has just been published in Life Sciences in Space Research, compares theoretical predictions of different models with actual observations for the first time. This work could one day be used to mitigate the risk to Mars explorers of radiation sickness and cancer.
“Using different models and comparing them to available data allows us to better understand the weaknesses and strengths in those models, and how we can apply them to extend our knowledge beyond the measurements,” explains lead author Daniel Matthiä of the German Aerospace Center in Cologne. “We can use the models to determine hypothetical scenarios, for example, the radiation exposure in shelters, habitats and underground.”
Radiation on Mars is far from consistent. Most of it consists of cosmic rays – high-energy particles shooting from outer space – and these vary with the fluctuating strength of the Sun’s protective magnetic field. But just being able to forecast general levels of cosmic rays isn’t enough, as different high-energy particles have different effects on human physiology, and the geography of Mars means that some places are more exposed than others.
There is already a handful of computer models that can predict, in high detail, the changing radiation field on Mars generated by cosmic rays. However, it is not known how accurate these predictions are.
Matthiä and colleagues have begun to find out, by comparing model predictions with data taken from the Radiation Assessment Detector (RAD) instrument aboard NASA’s Curiosity rover, on Mars since 2012. This effort was part of a “Blind Challenge” Model Comparison Workshop organised by Matthiä, Don Hassler of the Southwest Research Institute in Boulder, Colorado (Principal Investigator of the RAD investigation) and the RAD team. The RAD is supported by NASA and DLR to make exactly these kinds of measurements to help improve astronaut safety on future human missions to Mars. Building on a preliminary comparison last year, the researchers asked several modelling groups to predict the radiation environment on the surface of Mars for a two-month period, without seeing Curiosity’s results beforehand. “This is harder than it sounds,” says Matthiä.
Although the researchers are unable to quantify the accuracy of the model predictions yet, Matthiä says they were surprised in some cases by how much the models disagreed with each other and with the observational data. But this will help to improve the model predictions, he says, and ultimately provide more confidence when planning manned missions to Mars.
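The kind of blind comparison described above boils down to scoring each group’s predicted dose-rate series against the observations once they are unblinded. Here is a minimal illustrative sketch of such a scoring step; the dose-rate values and model names are invented for illustration and are not the actual workshop data or metric.

```python
# Hedged sketch: scoring hypothetical model predictions of the Mars
# surface dose rate against invented RAD-style observations.
# All numbers and model names are illustrative only.

def mean_relative_difference(model, observed):
    """Average of |model - observed| / observed over paired samples."""
    assert len(model) == len(observed)
    total = sum(abs(m - o) / o for m, o in zip(model, observed))
    return total / len(model)

# Illustrative daily dose-rate series (arbitrary units, invented values).
observed = [230.0, 228.0, 232.0, 229.0, 231.0]
models = {
    "model_A": [240.0, 238.0, 241.0, 239.0, 242.0],
    "model_B": [215.0, 214.0, 218.0, 216.0, 217.0],
}

for name, predicted in models.items():
    score = mean_relative_difference(predicted, observed)
    print(f"{name}: mean relative difference = {score:.1%}")
```

A real comparison would use the published RAD time series and more physically meaningful metrics, but the principle is the same: only after the predictions are fixed are they measured against the data.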
“Understanding, mitigating and managing the radiation environment for astronauts on a manned mission to Mars is a challenging, but not unsurmountable, problem,” Matthiä adds. “The more we understand about the environment, its variability, and its effect on humans, the safer our astronauts will be.”
Comparisons of model predictions and RAD data are not the only way to study the health effects of space radiation during a manned Mars mission, however. Last year, in another paper published in Life Sciences in Space Research, scientists reported upgrades at NASA’s Space Radiation Laboratory (NSRL) at the Brookhaven National Laboratory in Upton, New York, enabling the effects of cosmic rays to be simulated experimentally with greater precision here on Earth.
D. Matthiä et al.: “The radiation environment on the surface of Mars – Summary of model calculations and comparison to RAD data,” Life Sciences in Space Research (2017)
Hacking solutions to science problems are springing up everywhere. They attempt to remove bureaucracy and streamline research. But how many of these initiatives are coming from the science publishing industry? There is currently no TripAdvisor for choosing the best journal for submission, no Deliveroo for laboratory reagent delivery. How about a decentralised peer review based on the blockchain certification principle? Today, the social media networks for scientists—the likes of ResearchGate, Academia.edu and Mendeley—have made only a timid foray into what the future of scholarly publishing could look like.
This topic was debated in front of a room packed with science publishing executives at the STM conference, on 18th October 2016, on the eve of the Frankfurt Book Fair. Earlier that day, Brian Nosek, executive director at the Centre for Open Science, Charlottesville, Virginia, USA, gave a caveat about any future changes. He primarily saw the need to change the way incentives for scientist work so that, ultimately, research itself changes rather than technology platforms imposing change.
Yet, the key to adapting is “down to the pace of experiment,” said Phill Jones, head of publisher outreach, at Digital Science, London, UK, which provides technology solutions to the industry. Jones advocates doing lots and lots of experiments to find solutions to better serve the scientific community.
Indeed, “rapid evolution based on observed improvement is better than disruption for the sake of disruption,” agreed John Connolly, chief product officer at Springer Nature, London, UK.
Adopting an attitude that embraces these experiments “is the biggest change that we [the scholarly publishing industry] need to embrace,” Jones concluded.
To do so, “we need publishers to be a lot less cautious,” noted Richard Padley, chairman of Semantico, London, UK, which provides technology solutions to science publishers. “It is a cultural thing: publishers need to empower their organisation to use technology from the top down.”
So are the lives of scientists about to change? Arguably, yes, although resistance from proponents of the status quo may still arise. Much may depend on the pace at which science publishers turn into a technology service industry. The truth is that “users want to see tools that are much more user-centred and less centred around publishers,” argued Connolly. However, “if you had asked scientists what they wanted [in the past], they would have said high impact factor articles,” said Jones. “They thought this is what they wanted because there was no alternative,” he added, whereas “they wanted to have higher impact of their research and have greater reach.”
Clearly, “if you are optimistic about publishers, there is a job for publishers, to synthesise knowledge, to see the relevant content,” said Connolly.
This will require quite a lot of adjustment from those who pay for content. A download is not a marker of whether that synthesised knowledge has been passed on!
Increasingly, digital breadcrumbs are making it possible for others to track our every move. To illustrate what is at stake here, we need to travel back in time. In the pre-computer era, the ability to remain incognito made for great drama in black and white movies. It also opened the door to the perfect crime, without much risk that the perpetrator would get caught. Yet, when it comes to crime, invading people’s privacy could be justified, some argue, for the sake of the greater good and to preserve a stable society.
But now anybody can become the object of intense online and digital scrutiny, regardless of whether or not they are guilty of some nefarious crime. And there is a distinct possibility that digital natives may, in the not so distant future, take for granted that their every move and decision is being traced, without any objection.
It is not clear which is more worrying: that future generations might not even question that they are under constant digital scrutiny, or that our generation is allowing technology to develop without the safety nets that could secure our privacy now and for the future, leaving the next generation without any hope of the privacy we once took for granted.
Health offers an insightful comparison. It may appear paradoxical, but our society seems much more concerned about preserving our physical health than the health of our digital anonymity. Indeed, new drugs are subjected—rightly so—to a very intense regulatory process before being approved. But new technology, and the way the data it generates is handled, is nowhere near as closely scrutinised. It simply creeps up on us, unchecked.
Despite protests from regulators that existing data privacy laws are sufficient, greater regulatory oversight would invariably impact the way data collection is operated. Take the case of data used for research, for example. Experience has shown that even in countries where transparency is highly valued, such as Denmark, there have been deficiencies in getting consent for the use of sensitive personal health data in research, which recently created uproar. By contrast, the current EU regulatory debate surrounding the new Data Protection Directive has the research community up in arms, for fear that too much data regulation would greatly disrupt the course of research.
As for our digital crumbs, it has therefore become urgent to consider how best this data may be managed at the dawn of the Internet of Things. Striking the right balance between finding applications with societal relevance and preserving people’s privacy remains a perilous exercise.
Do you believe digital natives are unlikely to be as concerned about their privacy?
Should we allow technology to further develop without implementing the necessary privacy safety nets?
The EU’s General Data Protection Regulation (GDPR) includes exemptions to allow research on anonymised data. In this exclusive interview with Shawn Jensen, CEO of data privacy company Profila, Sabine Louët finds out about the implications of the new GDPR rules for citizens and for researchers. The regulation was adopted on 27th April 2016 and is due to enter into force on 25th May 2018. In essence, it decentralises part of the data protection governance towards data controllers and the people in charge of processing the data.
As part of the interview, Jensen explains the exemptions that have been bestowed upon certain activities, such as research, so that scientists can continue to use anonymised data for their work while having to ensure the privacy of the data required by the law.
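In practice, preparing research data of this kind usually starts with stripping or replacing direct identifiers. The sketch below illustrates one common first step, replacing names with salted hashes; the record fields are hypothetical, and it is worth noting that under the GDPR data treated this way is merely pseudonymised, and so still counts as personal data, because whoever holds the salt could re-identify subjects. Full anonymisation requires going further.

```python
import hashlib
import secrets

# Hypothetical health records; field names are illustrative only.
records = [
    {"name": "Alice Example", "birth_year": 1980, "diagnosis": "A10"},
    {"name": "Bob Example", "birth_year": 1975, "diagnosis": "B20"},
]

# A random salt kept separate from the released data. Under the GDPR,
# data pseudonymised this way remains personal data, since whoever
# holds the salt can re-link tokens to individuals.
SALT = secrets.token_hex(16)

def pseudonymise(record):
    """Drop the direct identifier and replace it with a salted hash token."""
    token = hashlib.sha256((SALT + record["name"]).encode()).hexdigest()[:12]
    out = {k: v for k, v in record.items() if k != "name"}
    out["subject_id"] = token
    return out

released = [pseudonymise(r) for r in records]
for row in released:
    print(row)
```

This is only a sketch of the easy part: genuinely anonymising data also means addressing indirect identifiers (such as birth year combined with diagnosis), which is where most of the legal and technical difficulty lies.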
In fact, the regulations are designed to preserve the delicate balance between protecting the rights of data subjects in a digitalised and globalised world and making it possible to process personal data for scientific research, as explained in a recent study by Gauthier Chassang, from the French National Health Research Institute INSERM. The study author concludes:
“While the GDPR adopts new specific provisions to ensure adapted data protection in research, the field remains widely regulated at national level, in particular, regarding the application of research participants’ rights, which some could regret.”
However, Chassang continues,
“the GDPR has the merit to set up clearer rules that will positively serve the research practices notably regarding consent, regarding the rules for reusing personal data for another purpose, assessing the risks of data processing …” In addition, he continues, “for the first time, the GDPR refers to the respect of ethical standards as being part of the lawfulness of the processing in research” and “opens new possibilities for going ahead in the structuring of data sharing in scientific research with measures encouraging self-regulation development.”
Read the original article here.
Discussions on privacy were top of the agenda at one of the biggest technology trade fairs in Europe, CeBIT 2017 in Hannover, Germany. This is hardly surprising: social media capture many aspects of the lives of those who share their views with as little as the press of a ‘like’ button on Facebook.
An invited speaker at CeBIT 2017, Michal Kosinski, assistant professor of organisational behaviour at Stanford Graduate School of Business, California, USA, shared an update on his latest work on predicting future behaviour from psychological traits inferred from the trail of digital crumbs we leave behind, including the pictures we share over the internet.
His latest work has huge implications for privacy. He believes human faces—available from pictures found on social media networks—can be used as a proxy for people’s hormonal levels, genes, developmental history and culture. “It is pretty creepy that our face gives up so much personal information,” he says. He adds: “I would argue sometimes it is worth giving up some privacy in return for a longer life and better health.”
In this context, regulators do not work in a vacuum; but nor can they guarantee absolute privacy. Kosinski explains that it is an illusion for people to strive to retain control over their own data. “The sooner as a society and policy makers we stop worrying about winning some battles in a privacy war, and the sooner we accept, ‘ok, we’ve lost this war,’ and we move towards organising society and culture, and technology and law, in such a way that we make the post-privacy world a habitable place, the better for everyone.”
In this exclusive podcast, Kosinski also discusses the constant struggle between top-down governance and bottom-up self-organisation, which leads to a constant trade-off in terms of privacy in our society. He gives an insightful example: the likes of Facebook, with their algorithms, would be uniquely placed to match people with the right job, or to detect suicides before they happen. However, this possibility raises questions about a level of invasion of people’s privacy that is not socially acceptable, even if it could solve some of our society’s problems.
Finally, Kosinski gives another example where people’s privacy has been invaded for the purpose of changing their behaviour. Specifically, he refers to interventions by the car insurance industry, which has added sensors to cars to monitor drivers’ behaviour, thus breaching their privacy in exchange for lower premiums.
Open Access (OA) continues to be the subject of discussion in the scientific community, including debates about the need for greater levels of open access. However, the reality on the ground is not as clear cut, and the adoption rate of OA is not as quick as its promoters would like it to be. At the recent STM Association conference held on 10th October 2017 in Frankfurt, I presented the findings of a study by independent publishing consultancy Delta Think, Philadelphia, USA, about the size of the open access market. The numbers help unearth recent trends and the dynamics of the OA market, and give a mixed picture. Although open access is established and growing faster than the underlying scholarly publishing market, OA’s value forms a small segment of the total, and it is only slowly taking share. With funders showing mixed approaches to backing OA, it might be that individual scientists have a greater role to play in effecting change.
New figures on the size and growth of the current open access market have recently been published via Delta Think’s Open Access Data & Analytics Tool, which combines and cross-references the many and varied sources of information about open access to provide data and insights on the significant number of organisations and activities in the OA market. They give a sense of how open access is faring, and what the future has in store for its adoption.
In 2015, the global open access market is estimated to have been worth around €330m ($390m); it grew to around €400m ($470m) in 2016. This growth rate is set to continue, meaning that the market is on course to be worth half a billion dollars globally going into 2018.
To put the sizing numbers into context, the growth rate in OA revenues is much higher than the growth in the underlying scholarly publishing market, which typically grows at a few percent –low to mid-single digits–per year.
However, OA’s share of market value is low compared with its share of output. Just over 20% of all articles were published as open access in 2016, yet they accounted for between 4% and 9% of total journal publishing market value, depending on how the market is defined. The OA numbers cover Gold open access articles published in the year, defined as articles for which publication fees are paid to make content immediately available; they exclude articles deposited in repositories under an embargo period and articles simply made publicly accessible.
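The gap between those two shares can be turned into a rough per-article comparison. Under the simplifying assumption that the 20% output share and the 4-9% value share describe the same market, the implied revenue per OA article relative to a non-OA article is:

```python
# Rough arithmetic behind the share figures above: if OA is ~20% of
# article output but only 4-9% of market value, the implied revenue per
# OA article is well below that of a non-OA article. This assumes the
# output and value shares describe the same market, which is a
# simplification.
oa_output_share = 0.20

for oa_value_share in (0.04, 0.09):
    value_per_oa = oa_value_share / oa_output_share          # relative to average
    value_per_non_oa = (1 - oa_value_share) / (1 - oa_output_share)
    ratio = value_per_oa / value_per_non_oa
    print(f"value share {oa_value_share:.0%}: "
          f"an OA article earns ~{ratio:.2f}x a non-OA article")
```

Even at the generous end of the range, the implied revenue per OA article is well under half that of a non-OA article, which is one way to frame the sustainability question raised below.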
Open access is taking market share only slowly. Taking the 2002 Budapest OA Initiative as a nominal starting point for market development, the data suggest that it has taken over 17 years for open access to comprise one-fifth of article output and, at best, one-tenth of market value.
The key driver of changes in the OA market remains funders. When choosing which journals to publish in, researchers continue to value factors such as dissemination to the right audience, quality of peer review, and publisher brand over whether their work will be made OA. Numerous studies have shown this, and suggest similar attitudes towards both journals and monographs.
Movement towards OA therefore happens where funders’ impetus overcomes researchers’ inertia. Funders’ appetites for OA in turn vary by territory, so the outlook for Open access growth of market share remains mixed. Funders in the EU have the strongest mandates, but many in the Far East, for example, incentivise publication based on Journal Impact Factor, regardless of access model, as they seek to enhance reputations and advance careers.
The relatively low value of open access content compared with its share of output poses interesting questions about the sustainability of the open access publishing model. To some, the data suggest that open access is cost effective and could lower systemic costs of publishing. By contrast, others suggest that we need to be realistic about systemic costs and global funding flows.
Further, although open access is clearly entrenched in the market, at current rates of change, it will be decades before open access articles and monographs form the majority of scholarly output. Opinions vary as to whether the transition to Open access is frustratingly slow or reassuringly measured.
The current data suggest that the discussion and debate around open access will continue as they have been. For the average researcher, it is therefore business as usual and so it might be that individual scientists have a greater role to play now to shift the balance, regardless of whether funders nudge them in the OA direction.
Daniel is Director of Data & Analytics for Delta Think, a consultancy specialising in strategy, market research, technology assessment, and analytics supporting scholarly communications organisations, helping them successfully manage change.