Posts by SciencePOD Editor

Welcome to the SciencePOD blog! Find out what is happening at the interface of digital publishing and science communication. Making sense of science and innovation through clear and compelling messages aimed at wide, targeted audiences is both challenging and exciting. Not only does it have implications for scientists and innovators themselves, but also for society at large. Making sure we get the complex ideas buried in the scientific literature and technical documentation right is crucial. SciencePOD brings you the peace of mind of access to the best professional teams, every time, to deliver quality content. In this blog, we bring you the latest topics related to content marketing, digital media, science publishing, open science and open data, and science in society.

Digital media changes how we talk

Photo Credit: bandt.com.au

A new linguistic study analyzes how technology transforms our communication. The current change is unique in its speed—and may have far-reaching cultural and educational consequences in the long run.

 

The medium we use affects the message we want to convey. That is why you probably would not end a romantic relationship with a text message, and it would be a bit strange to send a handwritten letter to your boss to tell her you are taking a day off sick.

Yet it doesn’t end there. The influence that the medium has over the messages we send has spread well beyond the confines of a few lines of text, and it is having a profound impact on the way we communicate as a society.

The third millennium started in a period of curiosity and excitement about the many new technologies entering our everyday lives. Mobile phones were becoming increasingly widespread, and SMS messages offered an entirely new form of communication.

“Digital media has become so crucial to our communication that it has created a new type of language.”

One person following this evolution closely is Ágnes Veszelszki, Associate Professor in communication and Hungarian linguistics at the Corvinus University in Budapest, Hungary. Digital media has become so crucial to our communication that it has created a new type of language. Veszelszki calls this “digilect”, and she defines and analyzes it in a new book that goes by the same title and looks at just how much our language has been changed by digital media.

“Digilect is the language variety (type) of digital media, which is typically used during communication taking place on computers or other digital devices,” Veszelszki explains. “It has many special characteristics in terms of form, spelling, grammar, and style.” For her book, Veszelszki indexed these characteristics based on findings from two surveys and several analyses of digital, handwritten, and printed texts, offering a wealth of linguistic innovations in English, German, and Hungarian.

ENTER THE EMOTICON ;‑)

The use of emoticons, spelling variants, and writing in a style similar to spoken language is central to digilect. “But the most striking innovations relate to vocabulary,” Veszelszki says. “For example, many new abbreviations and Internet-specific acronyms are used for simplification and time-saving purposes.” Now that written communication is taking over from oral communication—who has not been guilty of shooting an email to a colleague sitting on the other side of the room to avoid having to walk over—we cannot afford to spend too much time crafting messages.

“Who has not been guilty of shooting an email to a colleague sitting on the other side of the room to avoid having to walk over?”

As a result, many neologisms, or “netologisms”, have sprung from the Web, some of which have become commonplace in all layers of society and all manner of communications. Words like “hashtag”, “troll”, “meme”, and “like” had completely different meanings in the previous century. Abbreviations like “omg” (oh my god) and gesture-describing expressions such as “facepalm” were just gibberish. The English language has inspired many netologisms in Hungarian, and presumably in other languages too. The word “hack”, for example, is “hekkel” and “like” is “lájkol”. This may be because English dominates the online world, though another possible explanation is that it adds a layer of prestige.

FROM THE VIRTUAL TO THE ANALOG

Digilect is most notable in online communication, specifically in instant messaging conversations and on social media sites like Twitter and Facebook. The latter platform, especially, has an important influence on our public and private conversations. “Just think about it: Facebook is still showing a growing trend, with nearly half a million new users registering each day,” Veszelszki says. “Every six seconds, six new Facebook profiles are created.”

However, Veszelszki also found that these linguistic innovations have begun to influence the way teenagers formulate messages when they write by hand. When she asked them to fill in a paper questionnaire about their language use, “it was an unexpected outcome that the answers to open-ended questions included many digilectic forms, like abbreviations and smileys. Interestingly, they wrote smileys as if typed on a keyboard and not rotated by 90 degrees to better represent the human face.” To illustrate, they wrote ‘:-)’ or ‘XD’ instead of ‘☺’.

“I was interested to learn whether the new abbreviations invented in digital communication also have an impact on note-taking. The answer is a clear yes.”

Ágnes Veszelszki

Veszelszki delved deeper into this and collected a corpus of student handbooks to investigate the handwritten notes in the margins that students write to converse during class. “I was interested to learn whether the new abbreviations invented in digital communication also have an impact on note-taking. The answer is a clear yes,” says Veszelszki.

Veszelszki also found examples of digilect in spoken language, such as those in made-up quotes like this one: “Oh em gee, I lol’d so hard!” (fans of the Kardashian family may be familiar with these). Respondents to her survey said they had used abbreviations like “asap” and “brb” in conversations, as well as the netologisms “lájkol” (“like”) and “lávol”, which means “love” and was inspired by the pronunciation of that English word.

POST-LITERACY: BACK TO AN ORAL CULTURE?

As mentioned earlier, digilect is heavily influenced by oral communication. For example, digital communication resembles dialogue, is less structured than written text, and has looser spelling and grammar rules. This may have notable consequences, given the ubiquity of digital communication, as Western society is heavily based on text.

Veszelszki thinks the evolution of digilect can help offset the constraints that literacy imposes on society. “Some say that the emergence of literacy became a barrier between people, as literate societies lack the spontaneity of orality, and therefore people living in literacy-based cultures have separated from each other both psychologically and emotionally. Today, however, orality seems to have gained ground against literacy, so these ‘values’ are making their way back to society.”

In the post-literacy world of today, literacy has lost its absolute power in favor of spontaneous discourse, which is characterized by immediacy, emotional directness, and vitality. “Planned communication is no longer considered ideal, not even in literature,” notes Veszelszki. An example of this is the cell phone novel, a literary format particularly popular in Japan and China, which is made up of short chapters sent to subscribers via e-mail or SMS message.

BEYOND SLANG

Most speakers of digilect are from younger generations, so it is easy to assume that it is therefore a form of slang that youths will stop using as they grow older and adopt a more formal communication style. However, that is not how Veszelszki sees it. “Digilect is not only used by young people, but by everyone using computer-mediated communication. My research suggests that people from older generations who regularly use the Internet for work or play can also be fluent in the language variety of digital communication.” The fact that so many youths seem to be using this language variety might then simply be because younger generations are better versed in using computers and navigating the online world.

“Digilect is not only used by young people, but by everyone using computer-mediated communication.”

Ágnes Veszelszki

That does not take away the fact that today, many adults will not understand so-called digital natives talking online, or among themselves. This has important educational implications. “For adults dealing with children, it can be useful to be open to their communication style,” Veszelszki says. “This does not mean that they should talk and write like children, but they should at least know about the online content in which children are interested. Education should be better prepared—in terms of methods, tools, and content—for students who use digital technology day and night and sometimes know better than their teachers. It should teach them about the different forms of communication and the registers linked to them.”

It is difficult to predict how these language variations will continue to evolve, as technologies change so quickly that you can never know which new platform will arrive next. In fact, some innovations have already started dying out.

When everyone was still sending SMS messages, the character limitation inspired creative abbreviations, many of which quickly became widespread. Now that such boundaries no longer exist on instant messaging apps, that evolution has been at least partly turned around. What is more, word prediction functionalities can even cause us to write words in full and force us to use correct spellings and sentence capitalization.

Veszelszki prefers, then, to catalog these changes in our language, so that they might be remembered by future generations.

“Perhaps, in a way, this book was the last opportunity to ask, ‘How has the Internet changed language?’” she says. “Some people can no longer imagine life without the Internet; many seem unable to recall how they used to think before the Internet era. For those born in the golden age of the Internet and digital technology, this question is hardly understandable as they have no basis for comparison with the pre-Internet era.”

Maybe one day, they can find out in a Museum of Analog Communication.

This article was originally published at blog.degruyter.com.

 

Do science girls have an image problem?

Hot young girls in high heels. Powdered make-up exploding across bubbling and steamy apparatus. Equations written in lipstick. Sounds like a normal day in the lab for most women scientists. Except it isn’t. The scenes are, of course, snippets from the roundly and soundly derided ‘Science: It’s a girl thing’ video released to shock and awe–the bad kind–in 2012.

Born of a well-meaning but inherently flawed campaign from the European Commission, it has been criticised and parodied to the point that further condemnation for reinforcing stereotypes would be like pulling a girl’s hair and stealing her chocolate. Marie Curie’s appearance may arguably not be as attractive as a catwalk model’s, but if you could find visual props to picture a beautiful mind, she would be a shining star.

To be fair to the EC, it’s not hard to see why they thought they had to do something: the She Figures 2012 report points out that the share of women graduating at PhD level now stands at 46%, but women account for only 33% of researchers in the EU. And while 59% of EU graduate students in 2010 were female, only 20% of EU senior academics were women.

Is the image of women scientists to blame for the lack of popularity of science studies? It is clear the problems begin before university and academia. The UK’s Institute of Physics has found that, for the last two decades, only 20% of physics students past age 16 have been girls, despite roughly equal success for boys and girls in the physics and science exams leading up to that point.

How much could changing the image of female scientists do to solve the two problems that persist: boosting girls’ involvement in science from an early age, and removing the barriers that keep female scientists from top positions once they get there?

A classic remedy for anyone with an image problem would be to try to alter that image through advertising campaigns. But do these get-girls-to-do-science campaigns really work? “I don’t know,” says Claudine Hermann, Vice President of the European Platform of Women Scientists, who in 1992 became the first woman to be appointed as a professor at the military engineering school École Polytechnique, in Paris, France.

The trouble is that such campaigns often fail to convince. Hermann has spent the past 20 years immersed in the challenge and says there have been plenty of campaigns to convince girls—and boys—to go into science. “But they have not been very efficient,” she says. “You cannot know what would have happened if the campaign had not existed.” It sounds like the perfect area in which to test a policy change through a randomised trial. Others concur that advertising has obvious limitations. “I don’t think one video will make any difference,” says sociologist Louise Archer from King’s College London, who, like most, is not a great fan of the ‘Science: It’s a girl thing’ video. But she says she could see what they were trying to do.

Rather, the image problem may be just the tip of the iceberg, beneath which deeply ingrained cultural and social perceptions are slow to evolve. “Our research shows the masculine image of science is an issue, and ‘girly’ girls are much less likely to aspire to science careers than ‘non-girly’ girls, even though they both like science at school,” she explains. This suggests that young girls displaying an interest in science may fear being regarded as uncool by boys. “Analysis shows single-sex schools are the most effective way of getting girls to study physics at A level,” notes Archer. “Our surveys of over 9,000 primary and 5,600 secondary pupils show that the ‘brainy’ image of science is also a key part of the problem and can be particularly off-putting for girls. And social class is as much an issue as gender.”

Confirmation of the overwhelming impact of cultural and social influences can be found in Eastern Europe after the Second World War. Communist ideology dictated equal ability and opportunity between the sexes as society forged on as one, powered by engineering and science. Indeed, the EU expert group Enwise (Enlarge Women in Science to East) published a report concluding that the availability of childcare facilities and state support for working mothers led to a significant proportion of well-qualified women in high-profile roles, particularly in science.

Unfortunately, as the Communist Bloc unravelled, so did funding, infrastructure and many of these benefits, although the region still has a higher proportion of women researchers in science than the West today. This is particularly true in countries with smaller populations, which face challenges such as being frozen out of more competitive, high-cost research programmes. As Hermann notes, “it’s complex.” A Czech report from 2008 provides an update on the recent status of women in science in the Czech Republic, Hungary, Poland, Slovakia and Slovenia.

The issue is not just about stereotypes, however, and could also be linked to a widespread lack of knowledge of the high transferability of science qualifications. “Most young people don’t realise science qualifications are useful for a wide range of jobs both in and out of science,” says Archer.

Perhaps the lack of role models is also to blame. Hermann says that the under-representation of women scientists in the media is a problem too. From TV appearances to museum exhibits, the media often fail to recognise the role women play in science.

On the positive side, women already in science today stand a better chance of climbing the career ladder than before. Hermann cites programmes in Switzerland and Ireland that led to more women professors. “If there is a state policy and real funds, things can change,” she says. “But if you just speak, there will be very slow evolution. You need political will.”

And for political action you need increased awareness that there is a problem, which has gained much more prominence, according to physicist Athene Donald from the University of Cambridge, UK. She cites the Athena SWAN Charter for women in science, applied for by and awarded to universities, as an action that “has certainly raised everyone’s awareness and also the stakes.” A 2009 winner of the L’Oréal-UNESCO Awards for Women in Science and a noted blogger on the topic, Donald says actions are needed too. “This isn’t about generational change. This action will be more important at later career stages, university and beyond.” Actions that might work right now include not writing ‘Science: It’s a girl thing’ in lipstick on the EC’s revamped website.

Original article published on EuroScientist.com.

How to avoid Martian radiation

Researchers put model predictions of radiation to the test ahead of future manned missions to Mars.

Cold, dry, airless – Mars doesn’t make the most comfortable environment for human exploration. But what makes a manned mission to the Red Planet truly dangerous is its radiation, which is thought to be more than 500 times more potent than here on Earth.

Now, a team based in Germany and the US has made an important step towards predicting when, where and with what strength this radiation will strike. Their work, which has just been published in Life Sciences in Space Research, compares theoretical predictions of different models with actual observations for the first time. This work could one day be used to mitigate the risk to Mars explorers of radiation sickness and cancer.

“Using different models and comparing them to available data allows us to better understand the weaknesses and strengths in those models, and how we can apply them to extend our knowledge beyond the measurements,” explains lead author Daniel Matthiä of the German Aerospace Center in Cologne. “We can use the models to determine hypothetical scenarios, for example, the radiation exposure in shelters, habitats and underground.”

Radiation on Mars is far from consistent. Most of it consists of cosmic rays – high-energy particles shooting from outer space – and these vary with the fluctuating strength of the Sun’s protective magnetic field. But just being able to forecast general levels of cosmic rays isn’t enough, as different high-energy particles have different effects on human physiology, and the geography of Mars means that some places are more exposed than others.

There is already a handful of computer models that can predict, in high detail, the changing radiation field on Mars generated by cosmic rays. However, it is not known how accurate these predictions are.

Matthiä and colleagues have begun to find out by comparing model predictions with data taken from the Radiation Assessment Detector (RAD) instrument aboard NASA’s Curiosity rover, on Mars since 2012. This effort was part of a “Blind Challenge” Model Comparison Workshop organized by Matthiä, Don Hassler of the Southwest Research Institute in Boulder, Colorado (Principal Investigator of the RAD investigation) and the RAD team. RAD is supported by NASA and DLR to make exactly this kind of measurement and help improve astronaut safety on future human missions to Mars. Building on a preliminary comparison last year, the researchers asked several modelling groups to predict the radiation environment on the surface of Mars for a two-month period, without seeing Curiosity’s results beforehand. “This is harder than it sounds,” says Matthiä.

Although the researchers are unable to quantify the accuracy of the model predictions yet, Matthiä says they were surprised in some cases by how much the models disagreed with each other and with the observational data. But this will help to improve the model predictions, he says, and ultimately provide more confidence when planning manned missions to Mars.

“Understanding, mitigating and managing the radiation environment for astronauts on a manned mission to Mars is a challenging, but not unsurmountable, problem,” Matthiä adds. “The more we understand about the environment, its variability, and its effect on humans, the safer our astronauts will be.”

Comparisons of model predictions and RAD data are not the only way to study the health effects of space radiation during a manned Mars mission, however. Last year, in another paper published in Life Sciences in Space Research, scientists reported upgrades at NASA’s Space Radiation Laboratory (NSRL) at the Brookhaven National Laboratory in Upton, New York, enabling the effects of cosmic rays to be simulated experimentally with greater precision here on Earth.

Article details:

D. Matthiä et al.: “The radiation environment on the surface of Mars – Summary of model calculations and comparison to RAD data,” Life Sciences in Space Research (2017)

Original article published on Elsevier.com.

Offering a hacking solution for scholarly publishing

Changing researchers’ incentives and scientist-centric technology solutions could become the new normal.

Hacking solutions to science problems are springing up everywhere. They attempt to remove bureaucracy and streamline research. But how many of these initiatives are coming from the science publishing industry? There is currently no TripAdvisor for finding the best journal for submission, no Deliveroo for laboratory reagent delivery. How about decentralised peer review based on the blockchain certification principle? Today, the social media networks for scientists—the likes of ResearchGate, Academia.edu and Mendeley—have only started a timid foray into what the future of scholarly publishing could look like.

This topic was debated in front of a room packed with science publishing executives at the STM conference on 18th October 2016, on the eve of the Frankfurt Book Fair. Earlier that day, Brian Nosek, executive director of the Center for Open Science, Charlottesville, Virginia, USA, gave a caveat about any future changes. He primarily saw the need to change the way incentives for scientists work so that, ultimately, research itself changes, rather than technology platforms imposing change.

Yet, the key to adapting is “down to the pace of experiment,” said Phill Jones, head of publisher outreach at Digital Science, London, UK, which provides technology solutions to the industry. Jones advocates doing lots and lots of experiments to find solutions that better serve the scientific community.

Indeed, “rapid evolution based on observed improvement is better than disruption for the sake of disruption,” agreed John Connolly, chief product officer at Springer Nature, London, UK.

Adopting an attitude that embraces these experiments “is the biggest change that we [the scholarly publishing industry] need to embrace,” Jones concluded.

To do so, “we need publishers to be a lot less cautious,” noted Richard Padley, chairman of Semantico, London, UK, which provides technology solutions to science publishers. “It is a cultural thing; publishers need to empower their organisation to use technology from the top down.”

So are the lives of scientists about to be changed? Arguably, yes. Resistance from proponents of the status quo may still arise, and much may depend on the pace at which science publishers turn into a technology service industry. The truth is, “users want to see tools that are much more user-centred and less centred around publishers,” argued Connolly. However, “if you asked scientists what they wanted [in the past], they would have said high-impact-factor articles,” said Jones. “They thought this is what they wanted because there was no alternative,” he added, whereas “they wanted to have higher impact for their research and have greater reach.”

Clearly, “if you are optimistic about publishers, there is a job for publishers, to synthesise knowledge, to see the relevant content,” said Connolly.

This will require quite a lot of adjustment from those who pay for content. A download is not a marker of whether you have passed on that synthesised knowledge!

Original article published on EuroScientist.com.

Data privacy: Should we treat data handling the same way we do our own health?

Increasingly, digital breadcrumbs are making it possible for others to track our every move. To illustrate what is at stake here, we need to travel back in time. In the pre-computer era, the ability to remain incognito made for great drama in black-and-white movies. It also opened the door to the perfect crime, without much risk that the perpetrator would get caught. Yet, when it comes to crime, invading people’s privacy could be justified, some argue, for the sake of the greater good and to preserve a stable society.

But now anybody can become the object of intense online and digital scrutiny, regardless of whether they are guilty of some nefarious crime or not. And there is a distinct possibility that digital natives may, in the not-so-distant future, take for granted that their every move and decision is being traced, without any objection.

It is not clear which is more worrying: that future generations might not even question that they are under constant digital scrutiny, or that our generation is allowing technology to develop further without the safety nets that could secure our privacy now and for the future. The latter could leave the next generation without any hope of the privacy we once took for granted.

Health offers an insightful comparison. It may appear paradoxical, but our society appears much more concerned about preserving our physical health than the health of our digital anonymity. Indeed, new drugs are subjected—rightly so—to a very intense regulatory process before being approved. But new technology, and the way the data inherent to it is handled, is nowhere near as closely scrutinised. It simply creeps up on us, unchecked.

Despite protests from regulators that existing data privacy laws are sufficient, greater regulatory oversight would invariably impact the way data collection is operated. Take the case of data used for research, for example. Experience has shown that even in countries where transparency is highly valued, such as Denmark, there have been deficiencies in getting consent for the use of sensitive personal health data in research, which recently created uproar. By contrast, the current EU regulatory debate surrounding the new Data Protection Directive has the research community up in arms, for fear that too much data regulation would greatly disrupt the course of research.

As for our digital crumbs, it has therefore become urgent to consider how best this data may be managed at the dawn of the Internet of Things. Striking the right balance between finding applications with societal relevance and preserving people’s privacy remains a perilous exercise.

Do you believe digital natives are unlikely to be as concerned about their privacy?

Should we allow technology to develop further without implementing the necessary privacy safety nets?

Original article published on EuroScientist.com.

GDPR gives citizens control over their own data: An interview with Shawn Jensen

The data protection regulation GDPR includes exemptions to allow research on anonymised data. In this exclusive interview with Shawn Jensen, CEO of data privacy company Profila, Sabine Louët finds out about the implications of the new GDPR regulations for citizens and for researchers. The regulation was adopted on 27th April 2016 and is due to enter into force on 25th May 2018. In essence, it decentralises part of data protection governance towards data controllers and the people in charge of processing the data.

As part of the interview, Jensen explains the exemptions that have been bestowed upon certain activities, such as research, so that scientists can continue to use anonymised data for their work while ensuring the data privacy required by law.

In fact, the regulations are designed to preserve the delicate balance between the need to protect the rights of data subjects in a digitalised and globalised world and the need to make it possible to process personal data for scientific research, as explained in a recent study by Gauthier Chassang, from the French National Health Research Institute INSERM. The study author concludes:

“While the GDPR adopts new specific provisions to ensure adapted data protection in research, the field remains widely regulated at national level, in particular regarding the application of research participants’ rights, which some could regret.”

However, Chassang continues,

“the GDPR has the merit to set up clearer rules that will positively serve the research practices notably regarding consent, regarding the rules for reusing personal data for another purpose, assessing the risks of data processing …” In addition, he continues, “for the first time, the GDPR refers to the respect of ethical standards as being part of the lawfulness of the processing in research” and “opens new possibilities for going ahead in the structuring of data sharing in scientific research with measures encouraging self-regulation development.”

Read the original article here.

How to live in a post-privacy world: An interview with Michal Kosinski

Is it possible that, by giving up some privacy, we could have better health or cheaper car insurance rates?

Discussions on privacy were top of the agenda at one of the biggest technology trade fairs in Europe, CeBIT 2017 in Hannover, Germany. Indeed, social media are full of the many aspects of the lives of those who share their views with as little as the press of a ‘like’ button on Facebook.

An invited speaker at CeBIT 2017, Michal Kosinski, assistant professor of organisational behaviour at Stanford Graduate School of Business, California, USA, shared an update on his latest work on the ability to predict future behaviour from psychological traits inferred from the trail of digital crumbs we leave behind, including the pictures we share over the Internet.

His latest work has huge implications for privacy. He believes human faces—available from pictures found on social media networks—can be used as a proxy for people’s hormonal levels, genes, developmental history and culture. “It is pretty creepy that our face gives up so much personal information.” He adds: “I would argue sometimes it is worth giving up some privacy in return for a longer life and better health.”

In this context, regulators don’t work in a vacuum. But regulators cannot guarantee absolute privacy either. He explains that it is an illusion for people to strive to retain control over their own data. “The sooner, as a society and policy makers, we stop worrying about winning some battles in a privacy war, and the sooner we accept, ‘ok, we’ve lost this war,’ and we move towards organising society and culture, and technology and law, in such a way that we make the post-privacy world a habitable place, the better for everyone.”

In this exclusive podcast, Kosinski also discusses the constant struggle between top-down governance and bottom-up self-organisation, which leads to a constant trade-off in terms of privacy in our society. He gives an insightful example: the likes of Facebook, with their algorithms, would be uniquely placed to match people with the right job, or to detect suicides before they happen. However, this possibility raises questions concerning the level of invasion of people’s privacy, which is not socially acceptable, even if it could solve some of our society’s problems.

Finally, Kosinski gives another example where people’s privacy has been invaded for the purpose of changing their behaviour. Specifically, he refers to the intervention of the car insurance industry, which has added sensors to cars to monitor drivers’ behaviour, thus breaching their privacy in exchange for lower premiums.

Original article published on EuroScientist.com.

Open Access sector slow to mature

New figures show the relatively limited size and slow rate of uptake of the Open Access market.

Open Access (OA) continues to be the subject of discussion in the scientific community, including debates about the need for greater levels of open access. However, the reality on the ground is less clear cut, and the adoption of OA is not as quick as its promoters would like. At the STM Association conference held on 10th October 2017 in Frankfurt, I presented the findings of a study by Delta Think, an independent publishing consultancy based in Philadelphia, USA, on the size of the open access market. The numbers help unearth recent trends and the dynamics of the OA market, and they paint a mixed picture. Although open access is established and growing faster than the underlying scholarly publishing market, it forms a small segment of the total value and is only slowly taking share. With funders showing mixed approaches to backing OA, it may be that individual scientists have a greater role to play in effecting change.

Size and growth of the current open access market

New figures on the size and growth of the current Open access market have recently been published via Delta Think's Open Access Data & Analytics Tool, which combines and cross-references the many and varied sources of information about open access to provide data and insights on the significant number of organisations and activities in the OA market. The figures give a sense of how Open access is faring, and what the future holds for its adoption.

In 2015, the global Open access market is estimated to have been worth around EUR €330m (USD $390m); it grew to around €400m ($470m) in 2016. If this growth rate continues, the market is on course to be worth half a billion dollars globally going into 2018.
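The projection above follows directly from the two figures quoted. A minimal back-of-envelope check (the ~21% annual growth rate is derived here, not quoted in the study):

```python
# Open access market estimates from the article, in EUR millions.
market_2015 = 330
market_2016 = 400

# Implied year-on-year growth rate: roughly 21%.
growth_rate = market_2016 / market_2015 - 1

# Projecting one further year at the same rate lands near EUR 485m,
# consistent with "half a billion dollars going into 2018".
market_2017 = market_2016 * (1 + growth_rate)

print(f"Implied annual growth: {growth_rate:.0%}")
print(f"Projected 2017 market: EUR {market_2017:.0f}m")
```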

Open access market size in context

To put the sizing numbers into context, the growth rate in OA revenues is much higher than that of the underlying scholarly publishing market, which typically grows at a few percent (low to mid single digits) per year.

However, OA's share of market value is low compared with its share of output. Just over 20% of all articles were published as Open access in 2016, yet they accounted for between 4% and 9% of total journal publishing market value, depending on how the market is defined. The OA numbers cover Gold Open access articles published in the year, defined as articles for which publication fees are paid to make content immediately available; they exclude articles deposited in repositories under an embargo period or articles simply made publicly accessible.
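A rough implication of the share figures above, derived purely from the quoted percentages: if OA titles account for about 20% of article output but only 4–9% of market value, the average OA article earns only a fraction of the market-wide average revenue per article.

```python
# Shares quoted in the article for 2016.
oa_output_share = 0.20          # share of articles published as Gold OA
oa_value_share = (0.04, 0.09)   # share of journal market value, low/high estimate

# Revenue per OA article relative to the market-wide average per article:
# value share divided by output share.
low, high = (v / oa_output_share for v in oa_value_share)

print(f"OA revenue per article is roughly {low:.0%} to {high:.0%} "
      "of the market-wide average")
```

On these figures, an OA article generates roughly one-fifth to one-half of the revenue of the average article, which is one way to read the "cost effective" argument mentioned later in the piece.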

Uptake of open access

Open access is taking market share only slowly. Taking the 2002 Budapest Open Access Initiative as a nominal starting point for market development, the data suggest that it has taken some 15 years for Open access to comprise one-fifth of article output and, at best, one-tenth of market value.

The key driver of change in the OA market remains funders. When choosing which journals to publish in, researchers continue to value factors such as dissemination to the right audience, quality of peer review, and publisher brand over whether their work will be made OA. Numerous studies have shown this, and they suggest similar attitudes towards both journals and monographs.

Movement towards OA therefore happens where funders' impetus overcomes researchers' inertia. Funders' appetites for OA in turn vary by territory, so the outlook for OA market-share growth remains mixed. Funders in the EU have the strongest mandates, but many in the Far East, for example, incentivise publication based on Journal Impact Factor, regardless of access model, as researchers seek to enhance reputations and advance careers.

Future predictions of open access

The relatively low value of open access content compared with its share of output poses interesting questions about the sustainability of the open access publishing model. To some, the data suggest that open access is cost effective and could lower systemic costs of publishing. By contrast, others suggest that we need to be realistic about systemic costs and global funding flows.

Further, although open access is clearly entrenched in the market, at current rates of change, it will be decades before open access articles and monographs form the majority of scholarly output. Opinions vary as to whether the transition to Open access is frustratingly slow or reassuringly measured.

The current data suggest that the discussion and debate around open access will continue much as before. For the average researcher, it is business as usual, so individual scientists may now have a greater role to play in shifting the balance, regardless of whether funders nudge them in the OA direction.

 

Dan Pollock

Daniel is Director of Data & Analytics for Delta Think, a consultancy specialising in strategy, market research, technology assessment, and analytics supporting scholarly communications organisations and helping them successfully manage change.

Original article published on EuroScientist.com.

Illustration adapted from a photo by Astaine Akash on Unsplash