Connected ageing

Can technology fill the care gap?


With the number of Europeans aged over 60 set to double in the next 50 years, and the number of over-80s to triple, meeting the future care needs (and health costs) of ageing populations is critical. How can telecare, telehealth, robots and ICT help Europe’s senior citizens?

 

Meet Annabelle. She is a retired lawyer who lives in a large city and has been exceptionally healthy almost all her life. In the last year, however, Annabelle has gone through intensive rehabilitation after a femoral neck fracture. She is still, to a certain degree, physically impaired and has become more anxious after the incident. Annabelle lives alone. Though she used a computer for many years in her working life, she is not interested in technology or other digital tools. Annabelle finds new software and gadgets ‘demanding’ and is sceptical about the safety of her personal information when it is processed by different systems.

From the PACITA scenarios

Annabelle is just one of many European senior citizens who will need care in the near future. Projections show that she could live well into her nineties, alongside many of her peers. Annabelle could be diagnosed with a chronic illness, or with dementia. She is a widow and her children live elsewhere. Who is going to take care of Annabelle?

 

‘It is not possible to introduce new technology and not adapt the systems surrounding it. We need to look at governance, procedures, processes and patient involvement.’

 

Digital empowerment

The EU sees technology as a key solution. In the Digital Agenda for Europe, it expands on ICT’s capabilities for supporting ageing citizens, revolutionizing healthcare and providing better public services. Neelie Kroes, Vice-President of the European Commission responsible for the Digital Agenda, recently stated:

“Let us work together to capture the massive opportunities of new technologies – for our people to stay active and independent, empowered and in control. In turn, helping our healthcare systems and our economy. You are all aware of the facts: they are inevitable. People are getting older and more demanding. But getting older shouldn’t mean losing your dignity and independence. If facing a possible health problem, people expect answers about what they themselves can do about it. In a digital age, it should not be impossible to meet that expectation.”

Technology can play an important role in many areas of the care sector, explains Hilde Lovett, project manager at the Norwegian Board of Technology: “It can execute tasks, such as domestic chores, assist with medication or personal hygiene, and remind us of appointments and social occasions. It can increase mobility and active participation in society and help maintain and build social relations.” But we should also be aware that new technology could bring negative and undesired consequences. If visits by healthcare personnel are replaced by technology and remote communication, the risk of loneliness and isolation could be high.

That was one of the fears that came up at the PACITA workshops that took place in ten European countries during the spring of 2014. The workshops engaged stakeholders in discussion with the aim of producing policy recommendations for national and European policymakers on the topic of care for the ageing and technology.

European stakeholder initiatives
In the UK, the Technology Enabled Care Services (TECS) programme, announced in September 2014, aims to create the right commissioning environment to support and encourage the innovative use of technology to improve health outcomes, empower patients and deliver more cost-effective services. The re-focus from its predecessor, the 3millionlives project, is a result of demands from health and social care professionals for more practical support in evaluating technology-enabled care services. An online tool is to follow later this year. The TECS Stakeholder Forum’s collective views and proposals on how to address the barriers to wider adoption form the basis of the TECS Improvement Plan 2014-17.

The EU’s Innovation Partnership on Active and Healthy Ageing has an overall goal of increasing the average healthy lifespan by two years by 2020. It also sees great market potential in healthcare technologies, and sees stakeholder involvement as a chance of “boosting and improving the competitiveness of markets for innovative products and services, responding to the ageing challenge at both EU and the global level.”


Policymaking for the ageing in Europe

George (79) has dementia. He lives in his own house in a small town but depends on professional help and support in order to live on his own. George was a long-distance bus driver and his retirement pension is relatively modest. He is in good physical condition and enjoys moving around indoors and outdoors. But since dementia causes disorientation, George needs help finding his way. His occasional memory lapses make him dependent on others to manage his personal finances if he is to go on living alone. George enjoys different social activities – but he does not always remember how to take the initiative.

From the PACITA scenarios

 

‘Even though many senior citizens will struggle with health conditions, many will live long and healthy lives. These seniors could contribute in many ways.’

 

Although the EU looks to technology to solve the challenges ahead, there are several barriers that need to be overcome if technology is to contribute to the care sector in a positive way. Stakeholders at the PACITA workshops identified some of these barriers and discussed potential solutions. Since developments in technology move a lot faster than policies, it is important for policymakers to address these challenges as soon as possible.

Health not wealth: managing ageing earlier

Many of the stakeholders involved in the PACITA project emphasized the need for a governmental strategy to serve as the starting point. That strategy should aim at tackling the challenges in the healthcare system but also support societal values and encourage social contact for senior citizens. Stakeholders argued that a governmental health service has to be the starting point, to ensure that everyone receives basic care. On top of this, different approaches to implementing technology and new ways of organizing care services should be considered. Without such a fundamental strategy, stakeholders feared the development of a societal divide: wealthy, technologically competent seniors would be far better off than others.

Privacy and data protection are two issues that stakeholders considered very important. If the care sector were to start relying on self-monitoring, home alarm systems and GPS tracking, for example, new regulations and routines would be needed to handle the growing amount of data generated. Who should be allowed access to these data? Should relatives be able to monitor their loved ones any time they want?

Another recurring theme at the PACITA workshops was the need to take greater responsibility for our own ageing process than we do today. We need to start earlier and talk with relatives and friends about how we want to live as seniors. If the greatest wish is to live at home for as long as possible, the adjustments that make this possible need to be made early on. What types of technology could make your everyday life easier? Many people (though not everyone) will be diagnosed with dementia, so the privacy and ethical implications should be discussed with your relatives or doctor. How would you feel about your children being able to track your movements with a GPS? Would you be more comfortable moving to a care facility where personnel can watch out for you?

The potential of innovation

The degree to which different countries and regions have implemented technology in the care sector varies widely. But there is clearly an emerging trend of using technology – and the industry is flourishing.

César Rubio works with FENIN, the Spanish Federation of Healthcare Technologies, an organization that acts as a link between the industry and public administration in order to improve the health quality of Spanish citizens. He states that although technology is a part of the future, there are other changes that also must be acknowledged. It is not only technology that is changing, argues Rubio, but also the way we deliver healthcare services.

Hilde Lovett agrees with this: “It is unfortunate if the technology is implemented without looking at the bigger picture. How to work smarter, be more efficient and at the same time deliver better services will be a challenge.” Although there is a need to make care services more efficient, the stakeholders engaged in the PACITA project argued that the focus needs to be on creating better care, not just on the economic benefits.

There are high hopes for the business potential of technology for care. The European Innovation Partnership on Active and Healthy Ageing is one of the initiatives the Commission has introduced to enhance competitiveness in addressing social innovation. The Commission has financed numerous research and development projects with the aim of building knowledge capacity and environments for innovation in the field of care technology. As stated on the website of the Innovation Partnership, “…the field of active and healthy ageing has potential for Europe to be in front when it comes to research and innovation.”

Hilde Lovett sees the Commission’s effort as positive. One of the main messages from the workshops in the PACITA project is the need for arenas where knowledge exchange can happen between different types of stakeholders. In order to create solutions and strategies that will work when implemented, they need to be developed in a cooperative manner, taking into account the arguments of policymakers, the industry, employees in the care sector and the end users themselves.

César Rubio looks at the future changes from the industry’s side. We need to change the whole healthcare system, he argues. It is not possible to introduce new technology and not adapt the systems surrounding it. We need to look at governance, procedures, processes and patient involvement. Hospitals today are built around a ‘one size fits all’ approach. For newer concepts of care, flexibility is a must, in order to achieve the necessary changes.

Rubio also sees commitment as an important issue: policymakers need to create a platform where relevant stakeholders can meet and discuss different approaches. But there will also be a need for a clear strategy and decision makers need to show the industry and other stakeholders that they are willing to carry out changes according to the strategy, he argues.

 

‘Life expectancy will continue to increase, yet unhealthy life years make up around 20% of a person’s life.’

 

The need for flexibility was emphasized during the scenario workshops. It became evident that cultural and social traditions in Europe are quite different. It will therefore be important to make systems that have room for national or regional adaptations, so that the changes in the healthcare system have the best possible chance of succeeding.

Cost-effective?

For governments and policymakers designing strategies to deal with the ageing demographic and its impact on care budgets, determining the cost-effectiveness of new technologies is a critical consideration. Recent research from the Parliamentary Office of Science and Technology in the UK suggests that not only will telehealth and telecare need to be implemented on a ‘large scale’ if they are to be cost-effective, but coordination between a wide range of care sectors will also be required for success.

Assessing existing telecare and telehealth initiatives in the UK, the research noted: “The largest of these showed a potential reduction in deaths among patients, but found that telehealth and telecare did not reduce use of social or healthcare.” Using technology brings risks and acceptance issues. There are many thousands of ‘health’ apps for mobile phones, and while those providing diagnostic or dosage information could be considered ‘medical’ devices and therefore covered under the EU Medical Devices Directive, are they being monitored effectively? According to the report, interactive devices that self-monitor and manage conditions such as diabetes are predicted to be one of the technology growth areas, along with sensors (organic electronics) and the use of neural networks and interpretive systems.

Cost-effectiveness is only one measure of success. Studies that look at patient satisfaction and quality of life following the introduction of technology get mixed results, the report stated. Might a patient with a chronic condition reject a device that continually reminds them of their illness?

Disentangling the effect of technologies within the context of care, the role private companies have to play, and even whether those who commission services are in a position to judge whether they are being delivered to high standards are all important considerations. The conclusion in the UK was cautious: “However the technology develops, it is unlikely to deliver a silver bullet. Successful implementation of new technology will depend on the coordinated efforts of patients, clinicians and workers throughout the health and social care sectors.”

In theory, technologies should free up time and space for more personal care. There have been a number of promising applications.

 

Innovative ageing
Telemedicine, eHealth, ambient assisted living (AAL) and telecare are some of the many concepts describing the use of technology in care services, and there is a lot of innovative effort in these areas. A technology overview made by the PACITA project shows the European care sector has implemented a wide array of technologies, ranging from alarm systems, fall sensors and detectors to smartphone apps and self-monitoring equipment connected to the Internet. Employees in the care sector are also equipped with communication and administrative devices.

 

Robocare

Robots come in a variety of shapes, sizes and functions. In Denmark, most municipalities have experimented with robotic vacuum cleaners in nursing homes – an evaluation made by Copenhagen Business School in 2009 estimated that implementing these robots on a regular basis could save approximately one thousand cleaning jobs. A robot vacuum cleaner might not seem like a huge change, and using technology for mundane tasks like this has not received much criticism. Low-tech changes to lighting and flooring can also make a real contribution to creating more dementia-friendly living environments.

A robot that has caused far more debate is Paro – a robotic seal intended for social contact and for providing emotional stimulation to patients with dementia. “Stress and anxiety in patients with dementia can be hard to treat and demand a lot of attention from care personnel”, explains Hilde Lovett. “One can use sedatives of course, but what if there was another way that does not involve heavy medication?”

Paro was developed in Japan but has been tested and implemented in several European countries since 2003, including Denmark, Germany and the UK. Marketed as the ‘World’s most therapeutic robot’, Paro reacts to touch and sound and responds with small movements and noises. Some have expressed scepticism about the ethics of this, feeling that patients are deceived into thinking they are interacting with a living creature, but there are also many arguments in favour of its use. Paro has been well received in many pilots, Lovett explains: “The robot gives the patients a sense of being a care-giver, and the response the seniors get from the seal seems to calm them and stabilize their mood without having to use medication.”

doctor@home

The number of seniors with chronic diseases will increase in the coming years. Since the need for monitoring and frequent check-ups by medical personnel will grow accordingly, several technological solutions have been developed to reduce the pressure on the health services and place more of the care responsibility on patients themselves.

 

‘However the technology develops, it is unlikely to deliver a silver bullet. Successful implementation of new technology will depend on the co-ordinated efforts of patients, clinicians and workers throughout the health and social care sectors.’

 

One of the most common chronic conditions is chronic obstructive pulmonary disease (COPD). The World Health Organization (WHO) predicts that COPD will become the third leading cause of death worldwide by 2030. One technology that has been developed for this condition is a COPD kit, which allows patients to monitor their condition at home. They measure their vital signs and answer questions related to their sense of well-being. Many of these telehealth kits also include options for communicating with medical personnel via video or email if a patient needs advice.

Unwelcome illness

Living with a chronic condition can be psychologically difficult. If patients have a bad day, they might seek reassurance from an appointment with their doctor. If they leave it until it is too late, a stressful hospital admission becomes necessary. Daily monitoring can create a continuous overview of a patient’s condition, providing important information for medical personnel, and can help identify triggers that worsen or improve well-being. It could empower patients, giving them a greater feeling of being in control of their own condition. Although the equipment is a constant reminder of an unwelcome illness, patients seem to appreciate managing it in a comfortable, familiar environment rather than having to make daily trips to a hospital.

A technology overview made by the PACITA project shows that alarm systems are among the most widespread uses of technology in European homes. But we are just at the starting point of really exploring this field. Innovation and implementation projects, developing and testing technology that seniors could use at home, are taking place all over Europe.

Visiting Alma

In Oslo, Norway, ‘Alma’s House’ is a fully furnished apartment that functions as a testbed and showroom of assistive technology (AT) for people with dementia and cognitive disorders that can be implemented in the home. This includes safety-oriented aids, like fall sensors or smoke and fire detectors, and technology for social contact and communication, such as easy-to-use telephones, calendars and watches with speech functions. Sensors detect if a resident leaves the house in the middle of the night, and tracking devices can locate residents who are lost. The aim of the showroom is to provide a place where decision-makers, seniors and relatives, healthcare personnel and other stakeholders can visit, see and try different technologies.

Sigrid Aketun works as an advisor at the City of Oslo Resource Centre for Geriatric Care and has been involved with the development of Alma’s House. “The feedback has been great”, she reports. “Since the opening in 2012, we have had approximately 3,000 visitors. They range from decision-makers and other actors who plan and organize care services to groups from senior centres and organizations. The project has also attracted international attention, and representatives from five other countries have visited Alma’s House.”

Providing an informal arena that different stakeholders can visit is one of the key successes of Alma’s House. The perceived conflict between ‘cold’ technology and ‘warm’ hands in care might be overcome with initiatives like this; it is not a question of either/or. According to Aketun, “Our advantage has been that we have placed the technology in a physical environment which is adapted to the users. It shows how the technology can fit into the homes of seniors and help with everyday challenges without the technology ‘taking over’ the home completely.”

A shared responsibility


Are there, apart from sedatives, other ways to reduce stress in elderly patients?

Kevin (72) and Laura (68) live with their son and his family. Kevin is in poor health and does not go out very much to meet other people, but he is an active user of social media. He is part of a municipal online community for senior citizens where he helps others to choose and use new technology. As a retired engineer he likes to share his technological expertise and has a key role in the community.

Laura is still very healthy and uses her daily trip to the grocery store to keep in shape. On her way to the grocery shop, she checks in on three of her neighbours who are less mobile than herself. After her visits, she clicks ‘ok’ on a smartphone app that sends a message to the local care services. Laura is happy to help the neighbours and also appreciates the free hour of housekeeping she receives in return for looking after other seniors.

From the PACITA scenarios

At several of the PACITA workshops it was stressed how important it is to look at senior citizens as a valuable asset to society and not simply as a burden on the healthcare system. Almost all stakeholders reacted positively to an imagined scenario that emphasized a strong volunteer effort.

“Even though many senior citizens will struggle with health conditions, many will live long and healthy lives”, explains project manager Hilde Lovett. “These seniors could contribute in many ways, whether it is grocery shopping for other seniors, organizing social events or staffing the cafeteria at the care centre. It should be possible, and encouraged, to engage volunteers, both seniors and others, in care work in the future”, Lovett argues.

Stakeholders from all kinds of backgrounds agree that technology can be a solution to many of our challenges. But even more importantly, they highlight that the need to uphold societal values like privacy, dignity and social networks will only grow as technology makes its way into the care sector and our homes.

Care services for the elderly will probably look very different in the future. But we can all be part of forming policy for those services, whether we work in the care sector, develop technology, volunteer in our local community – or simply because we all grow older. It is important that national and European policymakers involve a broad spectrum of stakeholders to make the best possible future for Europe’s seniors, particularly in the light of the demographic and economic challenges Europe is facing. Seniors need help and support and although technology can solve some of these challenges, there needs to be a cooperative effort among many stakeholders to create technology and care services that will work together.

 

Ageing in 2025: What choices will we have?

During the spring and summer of 2014, the PACITA project organized scenario workshops in ten European countries, engaging more than 330 stakeholders in discussion about care, technology and the future of ageing. The aim has been to identify policy options for European policy makers, and make recommendations on how we can deal with the dilemmas that will occur when technology is introduced in the care sector.
In addition to the workshops, the PACITA project has studied the current use of technology in different European countries, and how far decision-makers have come in making explicit policies on the topic.

Why engage stakeholders?

Those that are affected, positively or negatively, by research, technological development and policy decisions are not always consulted, even though they have a stake in the issue. Stakeholder involvement is one way of making decisions more robust and socially acceptable, and the variety of voices opens the discussions to different kinds of knowledge, perspectives and dilemmas. PACITA stakeholders came from backgrounds such as local decision-making, the care sectors, IT and volunteering, as well as from senior organizations. They discussed and identified challenges and possibilities related to the implementation of technology in care.

Future scenarios

To create a common starting point for the stakeholders, the PACITA partners developed a set of future-oriented scenarios that served as a starting point for the discussions. The scenarios described different ways of organizing and funding care services, and different ways of using technology to increase the quality of healthcare for senior citizens. The scenarios also included stories presenting fictional characters, describing the way their everyday life is affected by the choices politicians make.
Scenarios are a great tool for facilitating forward-looking discussions: fictional stories force participants to consider different ways of organizing healthcare services and to give direct feedback on the scenarios. The fictional characters can be used to show ethical and social dilemmas that seniors might experience in their everyday life, and how different ways of implementing technology can create different dilemmas.

Ethical issues

Could care technologies be experienced as intrusive or unpleasant surveillance? How is privacy balanced against feeling secure? Will using technologies result in senior citizens feeling more or less isolated in their communities?

http://wp6.pacitaproject.eu

Further reading

WHO Facts about ageing/Age-friendly world

The WHO publishes data on many aspects of global ageing. A new website, Age-friendly World, launched in October 2014, aims to highlight initiatives in cities and communities that make life easier and more enjoyable for older people.

www.who.int/ageing/about/facts/en/

http://agefriendlyworld.org/en

Digital agenda for Europe – Ageing well with ICT

The focus of EU policy is that ICT can help older people to stay healthy, independent and active at work or in their community.

http://ec.europa.eu/digital-agenda/en/policies-ageing-well-ict

European Innovation Partnership on Active and Healthy Ageing

Brings together a wide array of stakeholders, shared interests and projects, geared towards achieving common goals and promoting successful technological, social and organizational innovation.

http://ec.europa.eu/research/innovation-union/index_en.cfm?section=active-healthy-ageing

European Commission: the 2012 ageing report

The economic and budgetary projections for Europe’s ageing population.

http://ec.europa.eu/economy_finance/publications/european_economy/2012/pdf/ee-2012-2_en.pdf

Almas Hus (Alma’s House)

A 50-square-metre flat, opened in 2012, which provides a dementia-friendly environment demonstrating assistive technology (AT) to support people with cognitive impairments and dementia, and which also serves as a knowledge centre on AT.

www.almashus.no

www.aldringoghelse.no/?PageID=4668&ItemID=3651

FENIN – Spanish Federation of Healthcare Technology companies

A multi-sector federation of manufacturing, import and distribution companies and associations in healthcare technologies, which supply all Spanish healthcare institutions.

http://fenin.es

TA projects on ageing and technology

PACITA – Ageing Society

PACITA has put cross-European stakeholder involvement in the debate on ageing into practice, and provided both national and EU-level policymakers with substantial input for meeting the societal and technological challenges and opportunities of an ageing population. The workshops took place during the spring and summer of 2014, and the policy advice will be presented to European policymakers in January 2015.

http://wp6.pacitaproject.eu

Telehealth and Telecare

UK Parliamentary Office of Science and Technology POST note 456, 14 February 2013, Peter Border

Current UK telehealth and telecare initiatives and the role they may play in delivering future care.

www.parliament.uk/briefing-papers/POST-PN-456/telehealth-and-telecare

Ambient intelligence and healthcare

Report from the Rathenau Instituut, 2009

Identifying desirable applications of ambient intelligence in the field of healthcare, together with potential problems or pitfalls.

www.rathenau.nl/en/publications/publication/ambient-intelligence.html

The future of ageing and new technology

Report from the Norwegian Board of Technology, 2009

The report gives an overview of the possibilities and challenges related to welfare technology and its possible implementation in Norway.

http://teknologiradet.no/english/more-care-with-bettertechnology/

 

Text: Marianne Barland
Photos: Fotolia, Iván Barreda  (Homepage image), Ellen Lande Gossner

 

Private health?

The success of Public Health Genomics – using genome-based information and technologies for the benefit of public health – is dependent on access to vast biobanks of data. But how, when and with whom should our DNA and medical data be shared? How can we protect patients’ genomic data without stifling research?

Last year, Yaniv Erlich, a 34-year-old ex-security specialist turned computational biologist, rocked the genomics world by showing that it is possible to discover the identities of anonymous people who participate in genetic research studies. He did so by cross-referencing their genetic data with surnames found on the internet.

Earlier studies, such as by Nils Homer in 2008, had already shown that people listed in anonymous genetic databases could be unmasked by matching their data to a sample of their DNA. All that was needed was some DNA obtained from a discarded paper coffee cup, and open source Genome Wide Association Studies (GWAS) datasets to associate particular individuals with specific diseases.

But Erlich, named ‘The Genome Hacker’ by the science journal Nature, showed something else: that it is possible to identify people by linking their genetic data to freely available information. All it took was an internet connection and a smart piece of software – an algorithm called lobSTR – to expose vulnerabilities in databases that hold sensitive information on thousands of people around the world.

‘With the current speed of genomic advances…everybody, be it on a personal level, or at a European decision making level, needs to start forming an opinion on it.’

Privacy is dead

‘In the genomics era, privacy is dead’. This phrase was heard several times at the PACITA Policy Hearing on Public Health Genomics that took place in January in Lisbon. It did not raise many eyebrows, though. The hearing, which aimed to address pressing political issues related to genomic technologies, brought together an eclectic gathering of international geneticists, technology assessment practitioners, public health experts, parliamentarians, jurists, patient representatives and policymakers. The PACITA ambitions were high: it wanted to bring stakeholders and politicians together to set a policy agenda for the ‘responsible introduction of Public Health Genomics’.


2014: The PACITA policy hearing on Public Health Genomics in the Portuguese parliament brought together politicians, experts and TA practitioners.

Perhaps there was simply too much to discuss that day in Lisbon, and too many aspects of public health genomics to deal with. When politicians were asked at the end of the day what they would take home as a pressing political issue, a member of the European Parliament said: “With the runaway costs of spending on public health, and genomics being a possible tool for that, it is an interesting topic. But when I get home, nobody is going to care about what I learned today, because they are only interested in day-to-day politics.” Another answered: “Well, I guess I’ve got more questions now than I had before I came. I had a naïve hope that the experts would tell me what to do. I have got a kind of picture of where we’re going, but no clue of how we will get there.”


The ultimate identifier
The human DNA sequence is often called the ‘ultimate identifier’. Our DNA is unique (except for identical twins). It can be used to predict a variety of medical conditions and traits. Think: hair, skin and eye colour, facial features, height and, allegedly, even smoking habits. Recent studies suggest that our genes steer our voting and economic behaviour, and our ability to stick to our spouses in marriage.


Compelling reasons

There are many compelling reasons for politicians to meet with geneticists and other stakeholders, though. In less than a decade, whole genome sequencing (WGS) has moved from a revolutionary moon-landing-style science project to a worldwide seedbed of entrepreneurial activity. The price of DNA sequencing technologies is dropping so rapidly that experts believe our children will have their full genomes read as part of their medical record, and as part of a completely new model of personalised public healthcare. But alongside the anticipated public health benefits, come lots of ethical and legal issues. How, when and with whom should we share our DNA and medical data? Policymakers and researchers will need to tread very carefully in crafting policies that protect patients’ genome data without stifling research.

DNA and privacy

The privacy and confidentiality concerns about whole genome data go a lot further than a simple decision about whether to have your genome sequenced for private medical reasons, and if so, whether these data should become part of your medical record. Sequencing a whole genome is still primarily done for research purposes, as scientists piece together which genetic mutations play a role in diseases. This often happens in large-scale (cohort) studies, some of them open source, which compare the DNA of two large groups of individuals, one healthy control group and one case group affected by a disease. Whole genome research involves the collection and storage of a biological sample, the sequencing of the genome, data analysis, and, more and more frequently, the release and exchange of these data in scientific databases to facilitate research.

But these processes are complicated by a number of other issues: the quantity of genetic information that is made available to commercial parties and public private partnerships, the commercialization of research results, the combination of genetic data and electronic health files, out of date consent procedures, data anonymisation capabilities and many other privacy concerns.

Monetisation of medical data

The day after the PACITA hearing in Lisbon, some of the urgent political issues surrounding Public Health Genomics – and its dependence on big data – hit the headlines, as British newspaper The Guardian reported on the potential commercial availability of medical records in the new care.data service proposed by the National Health Service (NHS England). The scheme aims to ‘join up’ doctors’ and hospital medical records of millions of citizens in order to improve medical services – to help diagnose drug side effects, for example, and to evaluate the performance of hospital surgical units and procedures by tracking their impact on patients. The newly established data services provider, the Health and Social Care Information Centre (HSCIC), will have the legal right to extract data from GP practices – a process that was meant to begin this month (see below). The extracted information is anonymised to some extent – stripped of the patient’s name – but the records do contain (parts of) a person’s NHS number, date of birth, postcode, ethnicity and gender. The care.data plan is part of a huge governmental investment scheme that aims to boost the British life sciences industry. Collecting, storing and analysing ‘national healthcare, public health and social care data, including personal data’, should make the UK the ‘leader in the race for better tests, better drugs and above all, more personalised care to save lives’, as Jeremy Hunt, UK Secretary of State for Health, stated.

‘Care.data needs to work: in medicine, data saves lives’

What’s gone wrong with care.data?

Widespread unease expressed by both the medical profession and patient groups has led to the roll-out of the scheme being put on hold for six months.

Care.data has run into a volley of privacy accusations. Would police and government bodies have the right to access people’s medical data? Even when medical data has been anonymised, how easy is it to piece together evidence to identify an individual and thereby discover information about them from their health record? It does not help that the NHS has form in the sloppy care of medical records. Right-wing newspaper The Daily Mail reported that in 2012, the NHS ‘lost track of 1.8 million confidential patient records in a single year’. ‘Sensitive’ paper records have been dumped in public bins and landfill sites. Computers containing medical records were found for sale on eBay, the newspaper reported.

The choice of an opt-out procedure, rather than an opt-in scheme, did not pass muster either. Patients who want to opt out of care.data need to arrange it with their GP, and it is not clear what data will be blocked. According to some experts, it is inevitable that the medical data of all Britons – whether with consent or not – will be sucked into the database. What seems universally agreed is that patient awareness – of the benefits as well as the privacy implications – is at very low levels. Organisations such as the British Medical Association have therefore welcomed the delay. ‘Care.data is in chaos’, wrote Ben Goldacre in The Guardian in February. “HSCIC needs to regain trust, by releasing all documentation on all past releases, urgently. Care.data needs to work: in medicine, data saves lives.”

DNA and medical records

Selling citizens’ sensitive medical data is one thing that would need careful democratic deliberation. But when DNA is to become part of electronic health records – as envisioned by Public Health Genomics enthusiasts – many believe that public awareness and political scrutiny become even more important. In the UK, according to British NGO GeneWatch, it is indeed the ultimate aim of the British Government to have the genomes of all 60 million Britons sequenced and attached to their electronic health records. In July 2013, the Department of Health announced the launch of Genomics England and the start of the 100,000 Genomes Project. Over the next five years, the personal DNA of up to 100,000 NHS patients will be sequenced. A spokesperson for this project said: “This unrivalled knowledge will help doctors’ understanding, leading to better and earlier diagnosis and personalised care. Based on expert scientific advice, we will start by tackling cancer, rare diseases and infectious diseases.” UK prime minister David Cameron stated: “It is crucial that we continue to push the boundaries and this new plan will mean we are the first country in the world to use DNA codes in the mainstream of the health service. By unlocking the power of DNA data, the NHS will lead the global race for better tests, better drugs and above all better care.”

‘The development of biobank infrastructure and the use of this as a basis for personalised medicine has become a central strategic goal in the fields of European biotechnology, genomics and international politics’

For researchers, the linking of genomic data to data from medical health records is crucial. To fully understand what does and does not trigger disease, DNA has to be linked to other factors, such as health data, lifestyle, and social and environmental factors. To figure out what exactly causes lung cancer, one would not only study genes and smoking habits but also demographics – postal codes, for example, to see if air pollution plays a part in the development of the disease.

The collection of biospecimens – samples of urine, blood, tissue, cells, DNA, RNA and protein – and other data for research purposes is nothing new. It has a long history in educational and medical systems, remaining largely uncontroversial, hidden away in the seclusion of pathology institutes. But with recent genomics advances, big data claims and booms in IT technology, the potential of opening up existing collections of biospecimens in biobanks, or starting new collections, has become a feverish pursuit. All around the globe, governments and companies are rushing into ambitious projects to find out how genome technology can best be used in a medical context. Huge data sets, with the DNA of hundreds of thousands of people, are needed to uncover genetic links that have so far proved elusive, but that are needed to address the promise of personalised medicine. As a report by the European Commission states: “The development of biobank infrastructure and the use of this as a basis for personalised medicine has become a central strategic goal in the fields of European biotechnology, genomics and international politics.”

World leaders

According to the European Commission, EU member states are already ‘world leaders in the development of biobanking infrastructure to support research, making huge investments each year to support such initiatives’. To give some examples: over the past few years, the UK Biobank has recruited 500,000 people aged between 40 and 69 to provide blood, urine and saliva samples for future analysis. They provided detailed information about themselves and agreed to have their health followed over many years. The Faroe Islands, an autonomous country within the Kingdom of Denmark, is offering genomic sequencing to all of the citizens of the archipelago, to understand the particular genetic diseases prevalent in this isolated population.

The same is happening elsewhere. In November 2011, the Beijing Genomics Institute launched the Million Human Genomes Project, to decode the genomes of over 1 million people for projects based in China and abroad. In the US, the Department of Veterans Affairs (VA) has been collecting the medical records and blood samples of a million US veterans since 2012. Dr Joel Kupersmith, the VA’s chief research and development officer, told US newspaper The Baltimore Sun that researchers “have long seen the potential at the VA because the system has 8 million enrollees of various ages and ethnicities with most every kind of age, health, and service-related disorder. All have an electronic medical record stretching up to 15 years.”


Personalised medicine?

DNA is the basis of all life on earth, including human life. The methodologies for reading the DNA sequence – for unravelling the code of life – are currently undergoing revolutions in both speed and cost. The first complete sequence of the human genome, finished in 2003, took more than a decade to produce, at a price of roughly 3 billion euros. These days, it takes roughly 2-4 weeks to read the full DNA of a human, at a cost of 2,000-5,000 euros. Within the next five years, it is expected to take just one day, for less than 500 euros. According to genomics believers, there will be two major consequences. The first is that medicine will become genome-based and personalised: products, tests and supplements tailor-made to our unique building plan.

The second consequence will be an increase in ‘predictive’ diagnosis: our DNA could be used to reveal the strong and weak points of our body, our talents and any hidden risks, including genetic diseases. Although we are still in the early days of understanding the genetic code, progress is being made on disease-associated DNA variants. Genome-based research is already enabling medical researchers to develop more effective diagnostic tools, to better understand the health needs of people based on their individual genetic make-up, and to design new treatments for disease. Most new drugs based on genome-based research are estimated to be at least 10 to 15 years away. It is very difficult to predict how much of our lives will be driven by our DNA in the end. From what we know now, both disease and health stem from a combination of our DNA, our environment and our lifestyle.


Reluctant donors

A big problem with biobanking, though, is that in Europe hardly anybody knows about it. A 2010 Eurobarometer survey on life sciences and biotechnology, conducted in 32 European countries, showed that more than two thirds of all Europeans have never heard of biobanks. When they were told, many European citizens appeared reluctant to become donors or participants in cohort studies. According to the survey, concerns about privacy and confidentiality are the first things that spring to mind. Although many people these days seem willing to post intimate information on social networks, medical data are still considered highly sensitive. Sharing medical information, illnesses and ailments is tightly connected to the doctor-patient relationship and the fundamental right of medical confidentiality. People also fear that the long-term storage of their data could be turned against them, through the violation of their privacy rights or through discrimination by insurers or employers.

Consent

One of the most controversial aspects of biobanking is the current use of ‘broad consent’ for those enrolling in a biobank, rather than seeking ‘informed consent’ from participants. For practical reasons, according to the EU Commission, ‘broad consent is now the norm for biobank recruitment’. Participants are asked to consent once to the broad use of their samples and data, rather than to specific, new or future research projects. This is not simply because researchers are reluctant to add extra barriers to their work and would prefer to spend time on their research rather than wrestle with added layers of bureaucracy. Sometimes asking for renewed consent has no benefit for patients; sometimes it is simply impossible. This is a problem for studies that use archived samples, for example. These samples were often collected using a consent process (if a consent process was used at all) that did not anticipate the potential identifiability of genomic data. But the use of broad consent for biobanking creates tensions because “data may be shared with large numbers of researchers, including commercial companies, both nationally and internationally, for purposes which may be unclear when the data sets are collected”, according to the PACITA Expert report. A recent international study involving Europe-wide focus groups was very clear on consent. Despite the perceived (research) need for broad consent, a large majority of Europeans (67%) would choose narrow consent and only 24% broad consent. As the authors observed: “It was a minority of people who thought it appropriate not to be asked for permission to have their details and samples entered in a biobank.”

‘China has by far the biggest genomics industry worldwide, and we exchange lots of data with the Chinese, but do you think they care much for privacy or human rights?’

Lacking rules

Research and biobanking communities have a long tradition in successfully guarding privacy and medical confidentiality and are scrambling to maintain this within a big data environment. They are setting up research ethics boards and data access committees, reviewing and publishing codes of conduct and governance mechanisms, and developing encryption and key management systems to restrict data access. But at present, genome researchers simply have no model to follow for protecting the privacy of genetic donors. As one geneticist quietly joked at the PACITA hearing: “China has by far the biggest genomics industry worldwide, and we exchange lots of data with the Chinese, but do you think they care much for privacy or human rights?”

The problem is that there is no clarity on consent procedures, and no consistent and coherent rules in the areas of privacy, data protection, the use of human tissue in research, and the exchange of these data across national borders. There are big differences in the implementation and enforcement of legal provisions, even among the EU countries that have signed the Data Protection Directive. Data protection rules are currently being revised in the new Data Protection Regulation, which is expected later this summer. However, biobank managers have en masse expressed concern that too strict a regulatory framework for human biobanks within Europe will create uncertainty and inhibit the building of a biobank infrastructure.

Heightened tensions


Jens Henrik Thulesen Dahl (Denmark), Maria De Belém Roseira (Portugal), Yvonne Gilli (Switzerland), Vittorio Prodi (Italy).

The current regulatory vacuum leads to heightened tensions between individuals’ need for privacy and confidentiality and the needs of researchers and biobankers in their pursuit of a societal benefit. Is asking for ‘broad consent’ in a genomic era ethically appropriate, and if it is, how should it be handled? Too strict a focus on privacy and confidentiality is likely to hamper research. Many researchers feel that, because of privacy rules, a lot of what is learned from genetic studies is neither published nor shared, and is therefore lost. David Altshuler, deputy director of the Broad Institute of MIT and Harvard, recently said in Scientific American: “There are literally millions of people who participate in medical research, and probably over a million people whose genomes have been characterized in some way or another, where the data is not freely available precisely because of privacy concerns.”

Biobanks are often paid for with taxpayers’ money and depend strongly on public support – if only for donations of samples and data. The EU Commission report notes that these are controversial undertakings: “Not all biobank projects are warmly reviewed by all groups in society.” US President Obama’s bioethics commission reported in 2013: “Without public trust, people may not be as willing to allow scientists to study their genetic information.” Securing acceptance and public trust – by creating awareness and transparency, and by finding solutions that balance a range of competing interests – is crucial to a successful translation of genome-based technology from research to the clinic. Weighing competing interests, setting boundaries, and finding a balance between protecting individuals’ privacy and the greater good is what politicians should do.

‘There are probably over a million people whose genomes have been characterized in some way or another, where the data is not freely available precisely because of privacy concerns.’

Awareness

The need for policy makers to address these public awareness issues more rigorously was emphasized at the PACITA hearing. Stressing the importance of informed citizens, awareness and education, Klaas Dolsma from the Dutch Erfocentrum, the national information centre on genomics and hereditary diseases, said: “With the current speed of genomic advances, it is for sure that at some time in our lives, each and every one of us will have to make decisions about genetic testing and hereditary disease. So everybody, be it on a personal level, or at a European decision making level, needs to start forming an opinion on it.”

 


Read more?

The Genome Hacker – Erika Check Hayden, Nature (2013)

Think tank on Identifiability of Biospecimens and -Omic Data, US Department of Health and Human Services (2012)

Biobanks for Europe – A Challenge for Governance, European Commission (2012)

Privacy and Progress in Whole Genome Sequencing, US Presidential Commission for the Study of Bioethical Issues (2012)

Whole Genome Sequencing: Innovation Dream or Privacy Nightmare? E. De Cristofaro (2012)

Data storage and DNA banking for biomedical research: informed consent, confidentiality, quality issues, ownership, return of benefits. A professional perspective – B. Godard et al (2003)

Public Access to Genome-Wide Data: Five Views on Balancing Research with Privacy and Protection – P3G Consortium, Church et al (2009)

Resolving Individuals Contributing Trace Amounts of DNA to Highly Complex Mixtures Using High-Density SNP Genotyping Microarrays – N. Homer et al (2008)

PACITA – Expert Working Group Report, Expert Paper and a Policy Brief (2014)


 

Text: Pascal Messer

Photos courtesy of the Portuguese Parliament

Big Data

Locating crime spots, or the next outbreak of a contagious disease: Big Data promises benefits for society as well as business. But more means messier. Do policymakers know how to use this scale of data-driven decision-making effectively for their citizens while ensuring their privacy?

 

The supercomputer MareNostrum, located in an old chapel built in 1920, is one of the fastest supercomputers in the world and the main resource of the Barcelona Supercomputing Center.

90% of the world’s data have been created in the last two years. Every minute, more than 100 million new emails are sent, 72 hours of new video are uploaded to YouTube and Google processes more than 2 million searches. Nowadays, almost everyone walks around with a small computer in their pocket, uses the internet on a daily basis and shares photos and information with their friends, family and networks. The digital exhaust we leave behind every day adds up to an enormous amount of data – electronic traces that contain a great deal of personal information.


Digital exhaust
The digital traces we leave behind when using digital services


Until recently, traditional technology and analysis techniques have not been able to handle this quantity and variety of data. But recent technological developments have enabled us to collect, store and process data in new ways. There seem to be no limitations, either to the volume of data or to the technology for storing and analyzing it. Big Data can map a driver’s sitting position to identify a car thief, use Google searches to predict outbreaks of the H1N1 flu virus, data-mine Twitter to predict the price of rice or use mobile phone top-ups to describe unemployment in Asia.

The word ‘data’ means ‘given’ in Latin. It commonly refers to a description of something that can be recorded and analyzed. While there is no clear definition of the concept of ‘Big Data’, it usually refers to the processing of huge amounts and new types of data that have not been possible with traditional tools.

‘The new development is not necessarily that there are so much more data. It’s rather that data is available to us in a new way.’

The notion of Big Data is somewhat misleading, argues Robindra Prabhu, a project manager at the Norwegian Board of Technology. “The new development is not necessarily that there are so much more data. It’s rather that data is available to us in a new way. The digitalization of society gives us access to both ‘traditional’, structured data – like the content of a database or register – and unstructured data, for example the content in a text, pictures and videos. Information designed to be read by humans is now also readable by machines. And this development makes a whole new world of data gathering and analysis available. Big Data is exciting not just because of the amount and variety of data out there, but because we can process data about so much more than before.”

In Viktor Mayer-Schönberger’s book Big Data: A Revolution That Will Transform How We Live, Work and Think, the concept of ‘datafication’ is used to describe this development: information is transformed into data so that it can be organized and analyzed by machines. Location is one form of information that has really been embraced as an important element in Big Data analysis.

Although location is not a new kind of data, its value has grown with the fact that many of us walk around with a GPS in our pockets. Tracking the movements of citizens for city planning, sending coupons and advertisements to customers nearby, or finding the nearest bus station and calculating travel time to your next destination are only a few examples of services that use location as one of their key elements. Combined with the fact that we are willing to share our location with many different services, it has become a very important kind of data in many types of analysis.

More and messier
Viktor Mayer-Schönberger describes some characteristics that can help explain Big Data and its possible use. The first is related to the amount of data accessible. Traditionally, researchers and analysts relied on a sample to do their analysis. Now, we have the technology to gather and analyze much more data – in some cases even ALL the data about a phenomenon. Having an enormous amount of data (as at Google or Facebook) gives us the opportunity to explore and examine details of the dataset, something that was never an option when working with a sample, simply because the amount of data was too small.

The second characteristic is something Mayer-Schönberger calls the ‘messiness’ of data. As the scale of information increases, so does the number of inaccuracies. In a sample, it is important that the figures are as correct as possible. With Big Data, Mayer-Schönberger argues, the sheer amount of data gives us a more valuable output, even though more errors may occur.

A tendency to move from causality to correlation is the third characteristic described by Mayer-Schönberger. New data-mining techniques can give us information about what is happening, without explaining why. Even though certain situations will demand causal explanations, correlations are often enough. Correlation shows us the relationships between data, and this relationship, depending on its character, gives us the possibility to predict certain events. In Mayer-Schönberger’s words: “Correlations help us capture the present and predict the future”.
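To make the distinction concrete, here is a minimal sketch in Python of what a purely correlational analysis looks like, in the spirit of the flu-search example above. All the weekly figures are invented for illustration: the code measures how strongly two series move together and uses that link to predict, while remaining silent on why the link exists.

import numpy as np

# Hypothetical weekly figures: flu-related web searches and recorded
# flu cases. All numbers are invented for illustration.
searches = np.array([120, 150, 200, 260, 310, 400, 520])
cases = np.array([14, 18, 25, 33, 38, 51, 64])

# Pearson correlation: how strongly the two series move together.
r = np.corrcoef(searches, cases)[0, 1]
print(f"correlation: {r:.2f}")  # close to 1.0 for these numbers

# A simple linear fit predicts cases from searches alone -- useful
# for forecasting, but it explains nothing about causation.
slope, intercept = np.polyfit(searches, cases, 1)
print(f"predicted cases at 600 searches: {slope * 600 + intercept:.0f}")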


Supermarkets are (predictably) keen users of Big Data analysis and of prediction based on a person’s shopping habits. Target, a US chain of retail stores, has collected data about its customers’ shopping habits for many years.

In 2010, a man entered a Target store, furious that they had been sending his teenage daughter pregnancy- and baby-related advertisements in the mail. Why would Target encourage teen pregnancy like this? The Target manager didn’t know what had happened and apologized to the father. A few days later the manager decided to make a follow-up phone call, and was met by an embarrassed father. It turned out that his daughter was indeed pregnant, and the father admitted that he hadn’t known the whole truth.

In recent years, Target has started using Big Data techniques to analyze the huge amount of data it has collected, with the aim of sending more personalized advertisements to customers. By following shopping habits, analysts found several interesting correlations:

“Women on the baby registry were buying larger quantities of unscented lotion around the beginning of their second trimester. Another analyst noted that sometime in the first 20 weeks, pregnant women loaded up on supplements like calcium, magnesium and zinc. Many shoppers purchase soap and cotton balls, but when someone suddenly starts buying lots of scent-free soap and extra-big bags of cotton balls, in addition to hand sanitizers and washcloths, it signals they could be getting close to their delivery date.”

Knowing and acting on these data, Target sends out customized advertisements to women who, according to its analysis, are pregnant. This is how the company knew a teenage girl was pregnant before her father did. The story shows how unstructured data are now used in complex analysis: it is not only how often and when we buy something that can be analyzed, but also the contents of our shopping bags.

‘Computers can calculate where and when future incidents are likely to happen. They can be surprisingly precise, allowing the police to be on site before anything actually happens’

Another example of the use of unstructured data is more recent. After the Boston Marathon bombings in April 2013, the Boston police adopted a new approach to data gathering in their investigation. Crowdsourcing – enlisting the public at large for a task – is a term most commonly associated with the crowdfunding of new products or services on websites like Kickstarter.com, where everyone who likes an idea or concept can donate money to help get it into production. In Boston, the police used crowdsourcing to gather crime information, asking everyone who had pictures or video of the bombings to send them in: a kind of digital ‘tip hotline’. But unlike the usual telephone line, where witnesses call in and a lot of information gets lost, getting the pictures and videos directly into their system helped the police establish an overview and timeline very quickly.

In addition to tips from the public, the police gathered data from social media, including tweets and the location of the tweeters. This use of unstructured data is one of the truly innovative elements of Big Data, and it will probably continue to grow, both in policing and in other sectors.

Picture a scene from the movie Minority Report. Tom Cruise’s character works in a police unit which knows how to predict crimes before they happen. Officers are present at the (future) crime scene before anything happens, and arrest people for the crimes they intended to commit. This might have been science fiction in 2002 when the movie came out, but it is now a reality in many police districts all over the world.

Illustration by Birgitte Blandhoel. This illustration shows hotspots that can be used to determine where the crime risk is the highest

Every day, the police gather huge amounts of data, both for operational and investigative use. Over time, these data can create a picture both of developments in criminal activity and of how the police do their work. In a future where data gathering will only increase, it becomes even more important to use these data in decision making.

After the terror attacks in Norway in 2011, the Norwegian Board of Technology (NBT) launched a project called ‘Openness and Security after the 22nd of July’. The Norwegian police and intelligence service were severely criticized in the official report investigating the terror attacks, and a significant element related to the way the police used and analyzed data – or more specifically, the fact that they didn’t. Making better use of the data the police already have – and harnessing new data from smartphones, social media and other sources – is one of the topics the Norwegian Board of Technology examined in its project.

Project manager Prabhu explains the concept behind predictive policing: “By feeding criminological models with both crime data and data from other sources, computers can calculate where and when future incidents are likely to happen. The predictions can be surprisingly precise, allowing the police to be on site before anything actually happens. Predictive policing models don’t just say crime is likely to happen on this street because that is what has happened in the past, but because a number of factors come together at that precise moment to make that particular spot a high-risk environment.”


Predictive policing
Predictive policing is the use of criminological data models and historical crime data to quantify the probability of where and when future crimes will occur. This information is presented in a way that is useful for the operational and strategic activities of the police (such as a map for patrolling police officers showing likely hotspots). Taken to the extreme, it also means that the police are no longer led by ‘facts’, but by probability calculations created by complex algorithms. In this respect, it is a new way of thinking about police work.


Knowing when and where the risk of crime is highest can increase the effectiveness of police work considerably. A ‘when-and-where’ analysis takes into account not only areas where there have previously been a number of crimes, but also when they occurred. An analysis like this can be visualized as so-called hot-spots: a map that shows police patrols where the highest risk of crime is at a given time.
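As an illustration of the underlying idea only (not any vendor’s actual algorithm), a bare-bones ‘when-and-where’ analysis can be sketched in a few lines of Python: count past incidents per map cell and hour, and rank the riskiest combinations. The incident list is invented.

from collections import Counter

# Hypothetical historical incidents as (grid_cell, hour_of_day).
# Real systems add weather, events and sociodemographics, and give
# less weight to older incidents; this sketch only counts.
incidents = [
    ("C4", 23), ("C4", 22), ("C4", 23), ("B2", 14),
    ("B2", 15), ("C4", 23), ("A1", 9), ("B2", 14),
]

counts = Counter(incidents)

# The top (cell, hour) combinations are the 'hot spots' a patrol
# map would highlight for that time of day.
for (cell, hour), n in counts.most_common(3):
    print(f"cell {cell} around {hour}:00 -> {n} past incidents")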

Criminal history
The Netherlands is one country where the police have started using Big Data techniques that analyze criminal history together with the time of day or week, the weather, geographical data and sociodemographics. The police in Amsterdam have used this method during the country’s annual celebration, Queen’s Day. By analyzing data from previous years, they were able to make a detailed plan of where and when to position themselves during the day, to be visible to the public and (hopefully) prevent crime from happening.

The NBT’s project discusses several examples of data-mining techniques and uses of Big Data: crowdsourcing information, function creep, data sharing and real-time information. But while these are new and exciting tools to examine, project manager Prabhu states firmly that these analyses only show part of the picture: “Predictive analysis is not a crystal ball that tells you the future, but mathematical models that express the probability of an incident happening based on certain theories and environmental elements. If you see a correlation between a weather pattern and a certain type of crime, it might be silly not to act on it.” But it is important to remember that correlation does not equal causation, and that you need insight into the data. You need knowledge about the models behind the analysis and about the kind and quality of the data being used. This kind of analysis and use of data is still somewhat new, states Prabhu, and it will take time for its practice to mature.

Privacy in the age of Big Data
When discussing Big Data it is impossible not to touch upon how it challenges our privacy. In 1995 the EU defined personal information as “any information that could identify a person, directly or indirectly”, and an important principle in privacy legislation has been that you, by giving your consent, should be able to decide who collects your personal information and when and how they are allowed to use it.

Although this is a good principle, it cannot take into account the explosion in data production of recent years. As Mayer-Schönberger argues in his book, not all Big Data sets contain personal data. But there are now more types of data that can identify you than before, because different datasets can be linked.

‘Correlations help us capture the present and predict the future’

Previously, your name, address and social security number were typical examples of personal data. Now, you can also be identified from your location, shopping habits, movie preferences or Facebook network. Only a small amount of information is needed to identify a person from their digital exhaust. By capturing and combining more data, re-identification becomes easy: even if you are ‘anonymous’ in one dataset, you can be re-identified by linking it to another set of data.

The movie rental and streaming company Netflix learnt a very expensive privacy lesson in 2006. They launched a contest with a million-dollar prize for anyone who could improve their film recommendation engine by ten percent. At the same time they released a dataset of 100 million rental records to help the developers. Personal information like names, user names and IP addresses had been removed, but researchers at the University of Texas at Austin compared the Netflix data with reviews from IMDB (the Internet Movie Database) and found matches between the anonymized Netflix records and IMDB reviews posted under full names. The research showed that by looking at the more obscure movies, they could identify the user 84 percent of the time.
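The linking idea itself is simple, even if the researchers’ actual method was statistically far more careful. Here is a toy sketch in Python, with entirely invented data and names: match records on the (title, rating) pairs that an ‘anonymous’ dataset shares with a public one.

# Toy illustration of linkage re-identification; datasets and names
# are invented, and real attacks match on far more records.
anonymized = {  # 'anonymous' rental records
    "user_817": {("ObscureFilm_A", 5), ("ObscureFilm_B", 2),
                 ("Blockbuster_X", 4)},
}
public_reviews = {  # public reviews posted under real names
    "Jane Doe": {("ObscureFilm_A", 5), ("ObscureFilm_B", 2)},
    "John Roe": {("Blockbuster_X", 4)},
}

# The public profile with the largest overlap in (title, rating)
# pairs is the likely identity behind the anonymized record. Note
# that obscure titles carry most of the identifying signal.
for anon_id, ratings in anonymized.items():
    best = max(public_reviews,
               key=lambda name: len(ratings & public_reviews[name]))
    print(anon_id, "is probably", best)  # -> Jane Doe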

As well as refining the consumer data of first-world moviegoers, Big Data is enabling greater understanding of the needs and habits of those who have been poorly understood up to now. A great deal of the exploding amount of data being produced now comes from developing countries. In 2010 there were over five billion mobile phones in the world, of which over 80% were in developing countries. In certain areas of the world where the telecommunication infrastructure is weak, mobile technology has become the preferred method for money transfers, job hunting, selling and buying items, looking up medical information – virtually everything.

Global Pulse
All this activity produces a lot of data, and in 2009, the Executive Office of the United Nations Secretary-General launched Global Pulse, a Big Data initiative for tracking and monitoring the impacts of global and local socio-economic crises. Big Data can be used to help decision-makers gain real-time understanding of how different incidents impact populations in developing countries. One project is looking at online text content (blogs, news posts, social media etc) in Indonesia to search for indications or predictions of trends in the official Consumer Price Index. Another project investigates how social media and online user-generated content can be used to enrich the understanding of the changing job conditions in the US and Ireland, by analyzing the mood and topics of online conversations. Global Pulse has identified three main opportunities of Big Data for global development:

  • Early warning: early detection of anomalies can enable faster responses to populations in times of crisis
  • Real-time awareness: fine-grained representations of reality through Big Data can inform the design and targeting of programs and policies
  • Real-time feedback: real-time monitoring of the impact of policies and programs makes rapid adjustments possible.

Kenth Engø-Monsen works as a senior data scientist at the telecommunications company Telenor. During the last year, Telenor has partnered with Global Pulse and will collaborate by providing analysis and data for its projects. Engø-Monsen sees great value in projects like these: Telenor collects large amounts of data, he says, but being a commercial enterprise, it uses them for advertising and product development. Seeing that the same kind of data can be used in humanitarian projects is inspirational, he smiles.

Telenor is cooperating in several projects with Global Pulse and Harvard University. One of the projects is about understanding how human movements affect the outbreak and spread of diseases in a country in Asia. “By mapping how humans move, by looking at the activity on their mobile phones, we can identify some patterns”, he explains. “Together with health data from the local health care system and epidemiological models, we can see how the disease spreads and creates outbreaks in different places in the country. The project’s aim is to identify the areas where the government should concentrate its measures when trying to prevent the disease from spreading.”

A similar project carried out in Kenya some years ago presented its results in Science magazine in 2012. By studying data on mobility and health, the researchers discovered that the area around Lake Victoria was one of the most active ‘hubs’ in the transmission of malaria. Based on this information, the researchers recommended concentrating government measures in this area: eliminating malaria here would lead to fewer outbreaks in other areas.

Analyses like these have quite a lot in common with how the police view the possibilities of Big Data: the combination of datasets over time creates possibilities for prediction. Hot-spot maps for disease transmission work in the same way as crime hot-spot maps – decision-makers can read off where their measures might be most effective.

“Data from mobile phones and our tools to collect and analyze them are so fast that we can actually update the information daily, even hourly,” says Engø-Monsen. “This is something quite revolutionary, and can help decision-makers be much more effective than today. Previously one had to wait several months before information had been collected, processed and analyzed. Now, one can get information that is almost real-time. In cases like outbreaks of dangerous diseases or other humanitarian crises this could be of great help.”

Global Pulse has a set of privacy and data protection principles that it follows when collecting, analyzing and storing data. These are based on a number of global legal instruments dealing with privacy and data protection. Respect for individual privacy forms the cornerstone of Global Pulse’s work. In addition, every data provider usually has its own privacy officer, who ensures that the company follows both internal and national privacy regulations.

How does Telenor, which collects a lot of personal information about its customers, ensure privacy when using these data in projects like Global Pulse? “Telenor is very aware of the sensitive nature of the information we collect,” states Kenth Engø-Monsen. “All data are carefully anonymized, and any personal information like a phone number or name is removed before the datasets can be used in the project.”

Location is key information in these projects, as mobility patterns are the cornerstone of the analysis. “We never use the location of individuals in the datasets”, Engø-Monsen explains. “What we do is aggregate the data. This means that we look at the movements of larger groups – never the individual.”
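A minimal sketch of what such aggregation might look like in practice (the movement records and the suppression threshold are invented): individual trips are collapsed into flow counts between areas, and flows below a minimum group size are withheld so that no small group can be singled out.

from collections import Counter

# Invented movement records as (from_area, to_area), with all
# individual identifiers already stripped.
movements = [
    ("north", "centre"), ("north", "centre"), ("east", "centre"),
    ("north", "centre"), ("east", "centre"), ("south", "west"),
]

MIN_GROUP = 3  # assumed threshold below which a flow is withheld

for (src, dst), n in Counter(movements).items():
    if n >= MIN_GROUP:
        print(f"{src} -> {dst}: {n} trips")
    else:
        print(f"{src} -> {dst}: suppressed (group too small)")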

A new privacy concept?
Technological development and our digital habits have changed the context of privacy. The amount of personal information out there is so much bigger than even just a decade ago, and it needs to be defined and protected in new ways. Robindra Prabhu suggests a new mathematical approach called differential privacy as a possible solution. “Differential privacy is privacy by design in a new way. Rather than explicitly removing sensitive information from the dataset, this approach seeks to build privacy-protecting measures into the operations performed on the data. When a user performs an operation on the data, the privacy mechanism kicks in. This is done by installing a digital guard between the database and the user that ‘blurs’ the answers in a way that keeps the sensitive information hidden without diminishing the value of the output.”


Differential privacy
Microsoft is one company that has started to look into differential privacy. In a white paper, they explain the maths behind it.
Differential privacy is a technology that enables users to extract useful information from databases containing personal information while, at the same time, offering strong individual privacy protections. This seemingly contradictory outcome is achieved by introducing relatively small inaccuracies in the answers provided by the system. These inaccuracies are large enough to protect privacy, but small enough that the answers are still useful.
The user never gets direct access to the database. Instead, a piece of software is placed between the database and the user. When the user asks for information from the database, the software adds ‘noise’ to the answer, so that it does not reveal any personal information. Imagine a user who wants to know how many citizens in a city suffer from a certain disease. If this number is one, it could be very easy to identify that one person. The software therefore adds some ‘noise’ and could give the user the answer 1, 0 or even -1. The user knows that some kind of noise has been added, and can draw the conclusion that very few people in this city have the disease, without knowing the exact number.
Microsoft Corporation 2012: Differential privacy for everyone
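In the research literature, the ‘noise’ Microsoft describes is typically drawn from a Laplace distribution scaled to the query’s sensitivity. Here is a minimal Python sketch for a count query (the function name and the privacy budget ε = 0.5 are our own illustrative choices): adding or removing one person changes a count by at most 1, so noise with scale 1/ε suffices.

import numpy as np

def private_count(true_count: int, epsilon: float) -> int:
    # A count query has sensitivity 1, so Laplace noise with scale
    # 1/epsilon gives epsilon-differential privacy. Smaller epsilon
    # means more noise and stronger privacy.
    noise = np.random.laplace(loc=0.0, scale=1.0 / epsilon)
    return round(true_count + noise)

# One citizen has the disease; the analyst might see 1, 0 or
# even -1, exactly as in the example above.
for _ in range(5):
    print(private_count(1, epsilon=0.5))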


Big Data analysis offers huge potential for private companies, giving them more information about their customers and their preferences, and so helping them design more profitable products and services. But can this technology be used in policy-making, and how?

This May, Neelie Kroes, Vice-President of the European Commission responsible for the Digital Agenda, stated: “Knowledge is the engine of our economy. And data is its fuel.” She argues that better data will provide the public sector with services that are more efficient, transparent and personalized. In addition, data can empower citizens by giving them more information and knowledge.

One of the few governmental areas that have started to look into Big Data is the security and intelligence field. But the lack of transparency and openness in these organizations helps to keep Big Data looking mysterious and threatening. The work done in the UN initiative Global Pulse offers more hope of how these techniques can be implemented much more broadly in society.

Kenth Engø-Monsen explains how they inform decision-makers through Global Pulse.

“When developing new technology or new methods there is always a technology component and a market component. Because Global Pulse is a non-profit humanitarian project, we get a chance to test the technology component without thinking about the market component. We can focus on developing technology and analytical models that actually work.
When we have a finished, successful project, Global Pulse and the UN can take the technology and the results and show them to decision-makers to prove that this is something useful. Global Pulse helps to bridge the gap between data providers and telecommunication companies on one side and governments and decision-makers on the other.”

For commercial purposes, it might be sufficient to use Big Data to see correlations. Governments, however, must also look at causality in order to respond to the analysis in a proper way. When it comes to policy-making, we need to see the people behind the data.


Read more

Global Pulse is an innovation initiative launched by the Executive Office of the United Nations Secretary-General, in response to the need for more timely information to track and monitor the impacts of global and local socio-economic crises. The Global Pulse initiative is exploring how new, digital data sources and real-time analytics technologies can help policymakers understand human well-being and emerging vulnerabilities in real-time, in order to better protect populations from shocks.

The mission of PredPol is simple: place officers at the right time and location to give them the best chance of preventing crime. The PredPol tool was developed over the course of six years by a team of PhD mathematicians and social scientists at UCLA, Santa Clara University, and UC Irvine in close collaboration with crime analysts and line level officers at the Los Angeles and Santa Cruz Police Departments.

Big Data gets personal – Technology Review’s special on Big Data
Big data and personal information are converging to shape the Internet’s most powerful and surprising consumer products. They’ll predict your needs, store your memories, and improve your life—if you let them.

BIG – Big Data Public Private Forum
Building an industrial community around Big Data in Europe is the priority of this EU-funded research project, together with setting up the necessary collaboration and dissemination infrastructure to link technology suppliers, integrators and leading user organizations. The project wants to promote adoption of earlier waves of Big Data technology and tackle existing barriers such as policy and regulation issues.

Books

Viktor Mayer-Schönberger & Kenneth Cukier (2013): Big Data – A revolution that will transform how we live, work and think. (HMH)

Eric Siegel (2013): Predictive analytics. The power to predict who will click, buy, lie or die. (Wiley)

Phil Simon (2013): Too big to ignore. The business case for big data. (Wiley)


Text: Marianne Barland

Photos: Barcelona Supercomputing Center, Birgitte Blandhoel, iStockphoto

Making Perfect Life? http://volta.pacitaproject.eu/megatrends-in-bioengineering/
The blurring boundaries between biology and technology
Biology and technology are merging in many more ways than ever imagined. An ambitious technology assessment study, Making Perfect Life, has delved deeply into the changes this might entail. How can policy makers ensure these are mostly for the good?

Whatever lives is fundamentally different from dead matter, or so we feel. As 21st-century citizens of industrial societies, many of us still believe that things that breathe are more sacred than things that do not. We may know that organisms are complex systems that can be understood in terms of their components and processes. But deep down, many of us feel that within the humblest bacterium there is an indefinable something that has so far escaped our analysis. How can the wet stuff of animate beings and the dead stuff of human-made contraptions merge seamlessly into one?

Yet that is just what is happening in labs around the world. Scientists and engineers are increasingly blurring the boundaries between organic creatures and mechanical creations. Four technosciences are involved, each of them offering their own perspective and technological contribution. It’s known as the NBIC convergence – Nanotechnology, Biotechnology, Information Technology and Cognitive science. To span the organic-mechanical gap, bridges are being built from both sides.

‘It can be very hard sometimes to see the difference between a machine and a living being.’

“The life sciences are approaching their subject in a more and more technological way,” says Rinie van Est (Rathenau Institute, The Hague), project leader of Making Perfect Life. “Biologists are no longer content to study or even manipulate organisms, but are bent on building new ones from scratch. In terms of their ambitions, that’s nothing less than a revolution. Equally novel is how the physical sciences are nowadays drawing inspiration from the functioning of living things and trying to imitate them. The Blue Brain Project, for instance, aims to replicate the entire human brain in a computer system, the hope being that with a full understanding of its functioning, we will be able to build more powerful and efficient computers.”

Informed by this observation, the project team chose ‘biology becoming technology, technology becoming biology’ as their guiding concept. “These are two megatrends that will have a huge impact on the future of our society”, says van Est. “And in a way, the two are really one, because they reinforce or even help to create each other. It reminds you of the famous Escher picture of the two hands. Together, the trends represent a new engineering approach to life.”

Controversial science
Along with nuclear power, biotechnology has been one of the most consistently controversial fields of scientific endeavour in recent times. Even today, techniques such as genetic engineering and cloning leave many people uncomfortable and are subject to restrictive regulation, especially in Europe. Will the new megatrends of biology becoming technology and vice versa lead to a similar sense of unease and to new bans and guidelines? “There’s certainly reason to discuss the foreseeable consequences of the innovations very carefully,” according to van Est. “That’s exactly the debate we are hoping to have initiated with Making Perfect Life. And it’s not just that, say, medicine will be able to cure more diseases – that would be relatively straightforward. Several of the new technologies are already being eyed up by other industries. We call that ‘a change in social practices’, and though there’s nothing at all wrong with it per se, it does raise a whole range of new regulatory issues.”

So much for the general concepts – let’s get down to the nitty gritty. The researchers have distinguished four main areas where the megatrends of bioengineering are playing out. They’ve labelled them intelligent artefacts, living artefacts, interventions in the body and interventions in the brain.


Making Perfect Life is a study commissioned and funded by the European Parliament, as a project of the Parliament’s Science and Technology Options Assessment (STOA) Panel, under the responsibility of Malcolm Harbour and Vittorio Prodi, MEPs. STOA contributes to the debate on strategic scientific and technological issues of political relevance, and on the policy options for tackling them, through projects of a medium- to long-term, interdisciplinary character, as well as information and dialogue activities, whose outcomes are relevant to the European Parliament in its role as legislator.


Intelligent artefacts are machines that have certain lifelike qualities without being alive in any biological sense. They have chips, not genes; they have metal and plastic components, not tissues. This is worth keeping in mind, because otherwise claims and fears can easily be overblown. These artefacts are equipped with sensors to register a variety of signals, especially those emitted by human bodies. Our good old organic senses already have synthetic counterparts. Several sorts of sensors have been around for a while, but they still illustrate the trend of technology-becoming-biology. Human signals registered include sound, as in speech, grunts and squeaks, and mechanical signals, commonly known as movement and body language. Other examples include chemical, electrical and thermal signals.

Biologists are no longer content to study or even manipulate organisms but are bent on building new ones from scratch… Illustration: Petit Comitè.

From these observations, computer software determines what we are ‘doing’ and how we are ‘feeling’. An appropriate response is calculated, which could consist of words, actions or a rudimentary display of ‘emotions’. If all goes well, the human partner will indeed perceive these as adequate. (Sometimes, however, not all goes well: with more than one person in a room, systems can get individuals mixed up.) When these digital skills, such as they are, are uploaded to a mobile device, you’re looking at a smart robot. When the skills are integrated into our living environment, the result is known as ambient intelligence. In both cases, we’re dealing with artefacts that are more interactive and more human-like than we’ve been used to so far.

With machines that respond adequately and in real time to our individual physical and psychological state, we already seem to be fulfilling the prediction that animate beings and inanimate contraptions will merge seamlessly. But an even smoother human-machine interface has hit the labs: neurophysiological computing. Here, just one thing is measured: patterns of brain activity. From these, emotions can be inferred. This is the technology steering thought-controlled wheelchairs, and which has also captured the imagination of computer games manufacturers and users. These new, intimate links between humans and machines are expected to find applications in three fields.
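A heavily simplified sketch of the signal-processing step behind such systems, in Python. The single channel, the synthetic signal and the decision threshold are all invented simplifications; real systems use many electrodes and trained classifiers. The sketch estimates power in the 8–12 Hz alpha band, which classically rises in a relaxed, eyes-closed state.

import numpy as np

def alpha_power(eeg: np.ndarray, fs: float) -> float:
    # Mean spectral power in the 8-12 Hz alpha band of one channel.
    freqs = np.fft.rfftfreq(len(eeg), d=1.0 / fs)
    spectrum = np.abs(np.fft.rfft(eeg)) ** 2
    band = (freqs >= 8) & (freqs <= 12)
    return spectrum[band].mean()

# Synthetic one-second recording: a 10 Hz rhythm buried in noise.
fs = 256
t = np.arange(fs) / fs
signal = np.sin(2 * np.pi * 10 * t) + 0.5 * np.random.randn(fs)

# The cut-off of 100 is arbitrary, purely for the demonstration.
state = "relaxed" if alpha_power(signal, fs) > 100 else "alert"
print(state)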

Sensitive interaction
The first of these are computers skilled in sensitive interaction, enabling them to take the place of human communication partners. They can function as extremely patient teachers, constantly eager computer-game adversaries or highly accurate doctors or nurses. Don’t be surprised when you see such systems appear and spread in e-learning, gaming and health care some time soon, because they’ve already reached the clinical testing stage.

But will we be comfortable with these ambiguous ‘beings’ around us, that are lifeless and lifelike at the same time? “It can be very hard sometimes to see the difference between a machine and a living being,” says Brigitte Krenn, an Austrian researcher of artificial intelligence. She gives a relatively low-tech example: “I know of one case where an old lady in a home for the elderly was confused by an emergency call coming from the intercom system – she thought a voice was talking to her through the television.” But it’s not just the elderly. Who can truthfully claim never to have been wrong-footed by a synthetic telephone voice, mistaking the speaker for a flesh-and-blood person? Machines are becoming more human. Krenn suggests that what is going on inside a machine should be visible on the outside – a hardware equivalent of the ‘what you see is what you get’ concept in software.

Philosophers are not afraid to raise questions that are simultaneously naive and profound. So, “Do we need all these new applications?”, wonders Jutta Weber (Technical University, Braunschweig), “especially when not everybody knows how to handle them?” She believes we need to give people a better technical education in using computers and applications first. Instead of just inventing things (and the list of innovations that never caught on is a long one), engineers should ask people what they need in their daily lives. This is not to suggest that engineers are attempting to push any old innovation down society’s throat, but that user needs and preferences should be taken into account at an early stage.


‘Biology becoming technology’ implies and promises new types of interventions which further enhance the manipulability of living organisms, including the human body and brain. It is illustrated by ‘top-down’ synthetic biology, molecular medicine, regenerative medicine, forward engineering of the brain and persuasive technology. The physical sciences (nanotechnology and information technology) are enabling progress in the life sciences, like biotechnology and the cognitive sciences, creating a new set of engineering ambitions with regard to biological and cognitive processes, including human enhancement. In the future, genes, cells, organs and brains may be bio-engineered in much the same way as non-living systems, like bridges and electronic circuits.


The ‘technology becoming biology’ trend embodies a (future) increase in bio-, cogno- and socio-inspired lifelike artefacts, which will be applied in our bodies and brains, be intimately integrated into our social lives, or used in technical devices and manufacturing processes. These (anticipated) new types of interventions and artefacts present a new technological wave that is driven by NBIC convergence. It is illustrated by ‘bottom-up’ synthetic biology, the shift from repair to regenerative medicine, reverse engineering of the brain and the engineering of living artefacts. Future development here relies heavily on so-called biomimicry or biomimetics: learning from the achievements of nature (though there’s room for improvement).


The second application of machines with human-like interactive skills is the benevolent personal supervisor: computers that monitor how we feel (fit or tired, alert or drowsy, amused or bored) and that are capable of intervening when we fall into an undesirable state. When our bodily signals cross some predetermined threshold, the computer will take action. Systems that alert car drivers who are dozing off are already on the market, and are bound to spread to other types of travel. Bored with a computer game? The manufacturer will want to measure when that happens too.

Ambient intelligence
The third application occurs when a computer is integrated into an everyday residential environment. Here, the user interface becomes all but imperceptible. In ambient intelligent applications, sharp sensors are built into the living space – artificial eyes, ears and noses. Early efforts have concentrated on environments for the elderly and infirm, to enable them to lead more independent lives. Ambient intelligence is likely to figure in other areas including ‘intelligent’ homes, health care and support for the disabled, as well as industry and business. Brigitte Krenn’s misgivings about machines posing as humans are relevant again, as is Weber’s question about the appropriateness of new applications.

‘People can feel that they are losing control of information which actually belongs to them. The public trust here is very fragile’

All these smart systems raise other awkward questions. It’s important to realise that they cannot function without collecting massive amounts of personal data about users – an incredibly detailed record of their actions, thoughts and emotions. The question is: who gets access to this sensitive data? It is certainly the sort of information that will interest many parties; knowing what people do, think and feel is the ultimate dream of any marketer, not to mention certain actors in totalitarian states.

“Our personal privacy is very much being affected,” warns legal expert Judit Sándor of the Central European University in Budapest. “People can feel that they are losing control of information which actually belongs to them. The public trust here is very fragile. We need to think about these issues in the long term.” By way of a practical solution, Michael Rader of the Institute for Technology Assessment and Systems Analysis in Karlsruhe suggests introducing a system of licensing and procedures to control the data. “We have enough experience to develop them, but at the moment we are lagging behind…” [European regulators reading this, get your yellow markers out.]

Equally awkward is what might be termed the fallibility issue. On the basis of information from sensors, these applications draw conclusions about people’s moods, deeds and needs, and take action accordingly. But their conclusions and actions are only as good as their software, which in turn is so complex that it is utterly impossible for programmers to predict how it will respond to every single eventuality. What happens when the computer makes the wrong decision? When dealing with a vulnerable person, serious or even fatal harm is a possibility. Human actors also make mistakes of course, so it might be argued that as long as the machines do no worse than we do, no ‘net harm’ is done. But who is responsible for these ‘automated mistakes’? Is it the manufacturer, or should the finger be pointed at the operator? European regulators will have to figure out what is just and practicable here.

‘Strong negative feelings among the general public are never far away, and metaphors such as ‘playing God’ and ‘Frankenstein’ – however clichéd – have lost little of their rallying power.’

The second main area of bioengineering aims at modifying existing life forms or even building new, mostly very small, ones. Unlike ‘traditional’ biotechnology, engineers have set their sights on creating these from scratch. In practice, there is a continuum, with species being genetically altered at one extreme and entirely new ones being crafted at the other. Reading from left to right, as it were, the ‘biology-becoming-technology’ trend is well established: the number of newly introduced artificial components and processes increases, while the number of natural components and processes shrinks. At the extreme right, where we see the opposite trend, the ‘technology-becoming-biology’ process is still in its infancy.

Cellular chassis
It is believed that in the future, the young discipline of synthetic biology will use synthetic genes as tools to transform cells into biological factories or agents with a highly artificial nature, based on a so-called minimal genome as a cellular ‘chassis’. The long-term ambition is to create ‘proto-cells’ that would be self-sustaining and self-duplicating, starting from non-biological molecular building blocks. Useful features could then be grafted onto these proto-cells, or so the reasoning goes. At this point in time, however, it is extremely difficult to assess the potential of synthetic biology.

Obviously, the traditional worries about biotechnology also pertain to these developments.

Strong negative feelings among the general public are never far away, and metaphors such as ‘playing God’ and ‘Frankenstein’ – however clichéd – have lost little of their rallying power. It is not just the general public; ethicists are also struggling with the issues raised by ‘creating life’. With bio-engineering becoming ever more ambitious and possibly more potent, a new bio-debate seems in order. European politicians have a choice. Should they stimulate public debate in order to develop societal standards for living artefacts, which may result in a cautious acceptance or outright rejection of synthetic biology? Or should they leave the fate of synthetic biology to market forces, hoping for a more ‘under the radar’ introduction but risking a public outcry and loss of credibility later on? In a democracy, the question should be a no-brainer.

Goggling beyond Google: How will we be seeing things in the future? Photo: Gettyimages.

Other policy choices are of a more technical nature. Are the safety standards for biotechnology adequate for synthetic biology, or should new approaches be adopted? Does synthetic biology require special regulation of intellectual property rights to ensure a healthy balance of open access and protection? Should Europe stimulate the establishment of technical standards in synthetic biology, to help European players catch up with the now-dominant US?

‘Old’ biotechnology used to be about things like genetically modified crops and cloned farm animals. With bioengineering, we are now targeting our own species. The human genome was first mapped back in 2000. It is expected that within a few years, it will be possible to sequence the entire genome of any individual in a matter of days (and at well under a thousand euros, a not excessive cost). The current frontier for research is squeezing meaning out of the raw data that result from whole-genome sequencing. This is where biology is, yet again, becoming technology. Once the billions of As, Cs, Gs and Ts can be confidently interpreted, it will be possible to predict – among other things – the diseases an individual is prone to, and even to establish which treatment is best, given the rest of the person’s genetic makeup. Personalised medicine is the name of this game.
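As a toy illustration of what ‘squeezing meaning’ out of those letters involves (the sequence, the marker positions and their interpretations are all invented, not real genetics):

# Toy marker lookup in a sequenced genome. Real interpretation
# relies on huge annotated variant databases, not a dictionary.
genome = "ACGTTAGCCGATAACGGTTACGATCGTA"

markers = {
    7: ("C", "typical response to drug X"),
    22: ("G", "elevated risk of condition Y"),
}

for pos, (variant, meaning) in markers.items():
    if genome[pos] == variant:
        print(f"position {pos}: variant {variant} found -> {meaning}")
    else:
        print(f"position {pos}: variant {variant} not found")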

The opposite movement, from technology to (human) biology, can be observed in current transplantation practices. While artificial implants, such as heart valves, have been commonplace for decades, the future might see implants manufactured on the spot by a three-dimensional printer. Artificial blood vessels are likely early candidates for this technology.

Transplantation medicine also provides us with an illustration of the Escher-like two-way movement: after the surprising biological discovery that the cells of the human heart are capable of dividing after all, a technology was developed for cultivating new heart tissue on the basis of the patient’s own cells. The method is being tested on pigs first. Stem cell technology presents another example. These forms of regenerative medicine hold considerable promise in terms of curing diseases and lengthening human life.

‘Near-term policy challenges and regulatory questions are already imminent’

But privacy is, once again, a huge issue. As with the use of intelligent artefacts, which allow the storage of an immense amount of data about an individual’s thoughts, feelings and actions, sequencing genomes could leave people feeling completely exposed with nowhere to hide. The ability to interpret the genome will increase over time, so that seemingly meaningless data will reveal more and more. How do we deal with the DNA material and other personal data which will become available in the coming years? According to Bärbel Hüsing of the Fraunhofer Institute for Systems and Innovation Research in Karlsruhe, more guidelines and standards are needed: “With the expected increase of data exchange and internationalisation, the biomedical field needs to adopt a code of conduct on how to share these data. What level of confidentiality is needed here? How are we going to handle biobanking and personalised medicine? My personal view is that more international harmonisation of regulations by the European Union is desirable.”

While the project studied a number of specific developments which exemplify the major trends in four fields of bioengineering, the purpose was also to alert politicians. Despite the long-term character of the megatrends, near-term policy challenges and regulatory questions around these specific developments are already imminent.

Two questions frequently crop up in discussions about new medical technologies. The British Conservative MEP Malcolm Harbour voiced one of the most fundamental: “When we discuss the issues of prolonging human life, the question remains: how far do we want to go?” The other question is equally fundamental and unavoidable: how much is society willing to pay for health care to cover an ever longer life? When this willingness reaches its limits, how do we deal with the inequity that arises when the rich can afford treatments that the rest of us don’t have the money for?

Interventions in the brain
If you feel queasy thinking about someone manipulating your brain cells, brace yourself for what’s coming next. Your brain, the delicate organ where many of us feel our inner self sits enthroned, may not only get copied, but in some individuals is already being regulated by technical devices. In the Blue Brain Project, technology is currently aiming to emulate biology. Although not all specialists in the field believe the idea of using computer simulations to understand cognitive functionality to be feasible or even particularly promising, the whole idea would have been inconceivable not very long ago, because of a sheer lack of knowledge about the brain. First, it was experiments on animals that yielded a good deal of information. Now, more direct knowledge of the human brain is being gleaned thanks to diagnostic and therapeutic technologies, including several types of brain imaging and stimulation.

Three of the major technologies here are Deep Brain Stimulation (DBS), Transcranial Magnetic Stimulation (TMS) and EEG neurofeedback. The aim of these technologies is (for now…) therapeutic: they are used for Parkinson’s disease, severe depression and ADHD, respectively. The use of these technologies is likely to be extended. For one thing, further therapeutic applications are being investigated, e.g. against epilepsy in the case of EEG neurofeedback. For another, it is very likely that healthy (or ‘neurotypical’) people could also benefit from neuromodulation by having their mood or cognitive performance enhanced, while EEG neurofeedback might well come to play a role in gaming.

This is where things get interesting from a regulatory perspective. The existing regulations for neuromodulation were drawn up exclusively for the medical domain, under the assumption that the devices would be operated and maintained by qualified personnel. But once new technologies get into the hands of less-qualified operators or ordinary consumers, new requirements are needed to keep users out of harm’s way. Even with trained personnel in place, there have been cases which should raise alarm signals. EEG neurofeedback has caused anxiety and insomnia; TMS can sometimes lead to hypomania, headaches and hearing loss.

‘Bioethics is ultimately toothless without biopolitics’

Once these technologies are ‘on the loose’ in society, there is an evident regulatory gap in urgent need of filling. It won’t do simply to declare the existing medical regulations applicable either, because in other domains the circumstances of use, the needs and even the risks may be different. Rather than wait for the devices to reach the market, politicians should consider the regulatory framework while these products are still under development. It’s important to note that this is not only true for medical applications of neuromodulation. All of the other technological trends described above could spread to new, unexpected fields, such as gaming, surveillance, nursing and forensics, to name but a few. Regulators are well-advised not to take a complacent or wait-and-see attitude.


The Making Perfect Life project

Research

Research has been carried out since 2009 by four member-organisations of the European Technology Assessment Group (ETAG):
• Institute of Technology Assessment (ITA), Vienna (Austria);
• Rathenau Institute, The Hague (Netherlands) (project co-ordinator);
• Fraunhofer Institute for Systems and Innovation Research, Karlsruhe (Germany);
• Institute for Technology Assessment and Systems Analysis (ITAS), Karlsruhe (Germany).

What makes it special?

• By looking at interlocking bioengineering developments, it brought to light deeper trends than would otherwise have been possible.
• It examined how technological developments could spread beyond their traditional fields of application, rather than mapping expectations of where technology is heading and listing problematic aspects.
• It analysed how the new technological wave is challenging the existing way of governing science and technology at a European level, and the governance challenges of 21st-century bio-engineering.


Speaking as a vice-chairman of STOA (the Science and Technology Options Assessment unit of the European Parliament), MEP Malcolm Harbour notes: ‘The task for STOA in the European Parliament is now to disseminate the conclusions of this wide-ranging and complex study, and to focus the findings on relevant policy issues. Now that the STOA secretariat forms part of the Parliament’s wider directorate on Impact Assessment and European Added Value, this should help ensure a joined-up approach to policy evaluation and new policy initiatives.’

By weighing up the ethics of bioengineering, we can address issues for the benefit and protection of ordinary Europeans, but it’s up to politicians to actually do it. As project leader van Est puts it: “Bioethics is ultimately toothless without biopolitics.”


Further reading

Van Est, R. & D. Stemerding (eds.) (2012) Making Perfect Life: European governance challenges in 21st century bio-engineering – Final report. Brussels: European Parliament, STOA.

Van Est, R. & D. Stemerding (eds.) (2012) Making Perfect Life: European governance challenges in 21st century bio-engineering – Study summary. Brussels: European Parliament, STOA.

Van Est, R. et al. (2011) Making Perfect Life: Bio-engineering (in) the 21st century – Monitoring report. Brussels: European Parliament, STOA.


Text: Gaston Dorren.

Photos: Gettyimages, Petit Comitè.

Finding Nano http://volta.pacitaproject.eu/anlsdfkansl/
Can public cynicism about food technology be overcome?

The trillion-euro food industry is keeping quiet about its nanotechnology research but, regulated or not, products will be coming to a fridge near you. Is that steak trying to tell you something?

 

‘I think the more information they give us, the more we’ll trust them.’

Longer shelf life, intelligent packaging, and healthier or ‘functional’ food carrying medicines or supplements are among the possibilities offered by nanotechnology in the food sector. But the food industry itself remains secretive about how nanotechnology is being used, which is raising the fears of EU citizens. Recent European TA studies stress the importance of transparent and credible information on nanoproducts: the need for information that addresses individual concerns and perceived risks should be taken seriously.

This spring, the FDA (U.S. Food and Drug Administration) issued new draft guidance on the use of nanotechnology in food and food-related products. The uncertainties related to nanotechnology in food are many, and the FDA wants manufacturers to consult them before putting a product on the market. It was a move welcomed by health and environment campaigners: “The agency is no longer ignoring the scientific consensus that these nanomaterials have the capacity to be fundamentally different, and can create new and novel risks, necessitating new testing,” stated George Kimbrell of the Campaign for Food Safety. By identifying nanotechnology as one of its main priorities, the FDA has sent strong signals that this is something it sees as highly relevant in the years to come, and has taken the discussion on the use of nanotechnology in food in the US to another level. We know that nanotechnology is already used in some food-related products. Is it time to speed up the discussion in Europe?

Nanotechnology in food and food-related products has only recently taken its first few steps into the consumer world. While new products are being released every day, it’s not yet the world of Willy Wonka and a three-course meal on a stick of chewing gum. The food and beverage category in the Nanotechproject’s Consumer Products Inventory returns over a hundred items. These include antibacterial kitchenware, storage products and utensils, but also edible products and food supplements. There’s Slim Shake Chocolate from Nanoceuticals, for example, described as ‘a technology advanced form of cocoa that offers enhanced flavor without the need for excess sugar’. Or Chinese NanoTea, which ‘can release effectively all the excellent essences of the annihilation of viruses through penetration so that a good supplement of selenium can be achieved and the selenium supplement function can be increased by 10 times.’

Nano benefits?

In fact there are many proposed ways that nanotechnology could improve our food. Fighting obesity by reducing the amount of fat and sugar in our food is one. Personalized food that adapts to the dietary needs of people with allergies or taste preferences is another. The technology can also be used in packaging and wrapping to improve the shelf life of food. These are positive outcomes that one could hardly disagree with. But there are also certain risks related to the use of nanotechnology. When materials and particles are manipulated on a very small scale and take on new properties, it is difficult to know for certain how the body or the environment will react. Because of these uncertainties, the introduction of nanotechnology in consumer products has been cautious, and the precautionary principle has guided its implementation. This principle states that if an action or policy has a suspected risk of causing harm to humans or the environment, the burden of proving that it is not harmful falls on those taking the action.


What is nanotechnology?

Nanotechnology is technology that operates on the nanoscale (one billionth of a meter). Particles at this scale exist in nature (for example salt particles from sea spray or protein particles in milk), but the development of nanotechnology enables scientists to manipulate matter at the nanoscale – so small that it cannot be seen with a regular microscope. We can use nanotechnology to reveal new properties in different materials, including in the area of food.


As one of the biggest industries in the world, the food sector is technologically advanced. From the early 2000s until around 2005, nanotechnology was a buzzword; it communicated innovation and forward thinking. But after some years of ‘buzzing’, the media started digging deeper and wrote more and more about the possible risks related to the technology. This made the public more sceptical, and products with the word ‘nano’ in their name disappeared from the shelves.

This can be illustrated by the case of Kraft Foods. In 2000, Kraft Foods, then one of the biggest food companies in the world, proudly announced its very own nanotechnology project – the Nanotek Consortium. It involved 15 universities all over the world and several national research laboratories. Presenting itself as a frontrunner in the development of nanotechnology in the food sector, Kraft Foods researched the use of nanotechnology both in packaging and in food itself.

After some years of activity, the consortium was renamed ‘The Interdisciplinary Network of Emerging Science and Technologies’, and passed over to Philip Morris. Mondelēz International (which now owns the brands of Kraft Foods) no longer fronts the development of nanotechnology in the food industry, but has a short text on its website:

“Currently we’re not using nanotechnology. But as a leading food company, we need to understand the potential this technology may hold for us in terms of food safety, product quality, nutrition and sustainability. That is why our research and development teams always keep their eyes on the scientific research, as well as consider potential applications where nanotechnology may be used in packaging material.” (Source: Mondelezinternational.com.)


Nano and TA

The huge promises from the research and food industries, combined with the fears communicated by NGOs, make nanotechnology a prime topic for technology assessment, says Adrian Rüegsegger, project manager at TA-SWISS, the Swiss centre for technology assessment. The prominent role of the food industry and food research in Switzerland was one of the reasons it commissioned a study on nanotechnology and food in 2009. “In this specific case more insight was needed, since many studies focused more on nanotechnology at large and less on its particular use in the food sector. By taking an interdisciplinary approach, technology assessment looks at both opportunities and risks, taking into account not only the technological challenges, but also the societal, ethical and regulatory aspects,” comments Rüegsegger.

Wrapped in nano

The industry is currently keeping quiet and no longer communicates its actions when it comes to nanotechnology, but that does not mean it is not active in development terms. “We can already find nano products in stores within the area of packaging and wrapping”, says Frans Kampers, coordinator of Wageningen Bionanotechnology Centre (BioNT), a research centre active in the fundamental science and technology of micro- and nanosystems and their applications in food and health. “Providing better and safer food for the consumer is the overall goal of these developments”, he continues. “A basic use of nanotechnology in this area could be to change the barrier properties of packaging; the food will be less affected by, for example, sunlight or the leakage of gases through the wrapping.”

One example of this is the American brewery Miller Brewing. Some years ago it wanted to change from glass to plastic bottles, which, because of their lower weight, would be much cheaper to transport. But it turned out the new plastic bottles were not able to keep the beer fresh, as gas leaked through the bottle walls. By using nano clay particles in the plastic, the barriers of the bottle were strengthened, and the beer now has a shelf life of up to six months.

Kampers is positive about the general possibilities nanotechnology offers: “Nanotechnology is an enabling technology with many applications. It is a toolbox with a very high precision level and can be applied in many areas, also in the food industry.”

Nanotechnology could also introduce us to the concept of “intelligent packaging”. Small nano sensors could be embedded in the food packaging to inform consumers when food is starting to degrade, for example through a system of colors. The label would be green when you buy the product and turn yellow when the food has only a few days left before going bad; a red label would show that the food is no longer safe for consumption.
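
To make the mechanism concrete, here is a minimal sketch (in Python) of the traffic-light logic such a label could implement. It is an illustration only: the spoilage reading, the thresholds and the function name are hypothetical, not taken from any real packaging product.

```python
# Toy model of an "intelligent packaging" freshness label.
# The spoilage reading and the thresholds are invented for illustration.

def label_color(spoilage: float) -> str:
    """Map a normalised spoilage reading (0.0 = fresh, 1.0 = spoiled)
    to the traffic-light colors described above."""
    if spoilage < 0.5:
        return "green"   # safe: well within shelf life
    if spoilage < 0.8:
        return "yellow"  # warning: only a few days left
    return "red"         # not safe for consumption

for reading in (0.1, 0.6, 0.9):
    print(f"sensor reading {reading:.1f} -> {label_color(reading)}")
```

In a real label the reading would come from a nano sensor reacting chemically to decay products; the consumer would see only the color.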

Implementing nanosilver in packaging to keep food free from bacteria is a technology that is already in use today. Kampers refers to research that shows there is little migration between the food and the packaging: “It seems that this application of nanosilver could be a good solution. If the silver particles stay in the packaging and don’t migrate into the food, the person eating the food will not measurably be exposed to the silver.”

And this is what it comes down to: exposure.


Read More?

TA projects on nanotechnology
Several TA institutions have done or are doing projects concerning nanotechnology, within the food industry and also in a wider context.

Governance of Nanotechnology in the Netherlands – Informing and engaging in different social spheres. Rathenau Instituut (2012)
Describes the wide range of activities that were organised in the Netherlands to bring a public perspective into the development of nanotechnology. Will be published in a special issue on public engagement in the International Journal of Emerging Technologies and Society (iJETS) later this year.

Ten lessons for a nanodialogue. Rathenau Instituut (2008)

Nanomaterials: Effects on Environment and Health. TA-SWISS (2009)
An overview of commercial products which contain nanomaterials and an analysis of future trends.

Nanotechnology in the food sector. European Parliament (2009)
Commissioned by TA-SWISS and conducted by the Institute of Applied Ecology (Freiburg, Germany), a STOA (Science and Technology Options Assessment of the EU Parliament) study which assesses products in respect of environmental issues and sustainability, showing the direction that future developments might take and where there is a need for caution.

Nanotechnology in the EU
Policy, research and actions on nanotechnology from the EU


 
Negative focus?

Andy Booth, a researcher at the Scandinavian research institute SINTEF, is a specialist in engineered nanoparticles. “Wearing a silver ring on your finger is not seen as risky,” says Booth, “however, eating products that have been in contact with nano silver particles is perceived as something else.” Though he agrees there are certain risks connected to the use of nanotechnology, he feels the media has been biased in its coverage. “We don’t have a balanced picture of nanotechnology. The focus is more or less always on the negative. Sure, there is a risk, but this is always related to exposure. The technology has so many possibilities that we should not kill it before we have assessed both the risks and the benefits.”

The TA-SWISS study Nanotechnology in the Food Sector (2009) found that food packaging modified by nanotechnology promised real ecological value – provided appropriate recycling systems can be set up. The effects of nanoparticles over the whole life cycle of a product must be taken into account, which means during the manufacturing process, in contact with the food, and in the case of packaging, when it is disposed of or recycled.

Healthy eating

One of the most positive prospects for nanotechnology in food is the potential benefit to our health. Being able to reduce the amount of salt and fat in food without affecting the taste or texture certainly appears enticing and could help in overcoming issues such as obesity. In the UK, Leatherhead Food Research (whose working group NanoWatch has been running since 2007) has shown that the size of salt particles can affect taste. By using salt particles at the nano level, it would be possible to reduce the amount of salt and still get the same taste. Another ‘healthier version’ example would be making a low-fat mayonnaise by manipulating the texture at the nano level, so that the product still tastes and feels as creamy as the full-fat alternative.

An application of nanotechnology that could be useful for special groups of consumers is varying the quantity of nutrients or vitamins in food. Some people have conditions, caused by allergies, diets or other factors, that make it difficult to absorb sufficient vitamins. Nanotechnology could help these groups get the nutrition they need. These functional ingredients can also be designed into a delivery system, so that they reach the place in the body where they will be most effective, without degrading on the way. This kind of delivery system has also been introduced in the field of medicine to get the most effective use of certain drugs.

‘The technology has so many possibilities that we should not kill it before we have assessed both the risks and the benefits.’
 
Nano ‘meat’

Frans Kampers believes that nanotechnology could also make our meat consumption more sustainable. “In the future, meat and animal protein will be scarce. It will be impossible to produce enough meat if large populations, who until now have eaten less meat, start adopting the western lifestyle. The current way of producing meat using animals is simply too inefficient. In some cases only ten percent of the plant protein is converted to meat. It would be an interesting opportunity to use nanotechnology to make a meat replacement directly from plant protein. If it tastes and feels sufficiently meat-like, consumers will probably like it. Using a source of plant protein, scientists could manipulate the proteins already in the plant to make the taste and texture like the meat we know today.”

But what will our future meat look like? In February 2012, Professor Mark Post and his team at Maastricht University in the Netherlands revealed they were growing a hamburger. Take some bovine stem cells and serum from an equine foetus and grow a few thousand strands of muscle. Hungry yet? Photo: iStockphoto.

 
Regulation

Scientists agree that there are a number of opportunities in the field of nanotechnology that could benefit consumers and society as a whole. Nano products, mostly related to wrapping and packaging, are on the shelves, but many more are still in development in labs around the world. Regulating the use of nanotechnology and dealing with the risks related to exposure will be important in the years to come. It will give the industry clear guidelines and will also help educate and inform consumers.

In Europe it is the European Commission that regulates the use of nanotechnology and it is mindful of the importance of a solid framework:

“The EU has invested a great deal of money in research and development for nanotechnologies. It must now create the right conditions for realizing their full potential. The EU has decided to take an “integrated, safe and responsible approach” to the development of nanotechnologies. This includes: reviewing and adapting EU laws; monitoring safety issues; engaging in dialogue with national authorities, stakeholders and citizens.”

There are already laws regulating food safety, food packaging and novel food. How nanotechnology fits into these different regulations is less clear. There is no single definition of nanotechnology or nanomaterials that everyone agrees on. This creates problems in formulating laws and regulations, which in turn makes it difficult to label and register products. So even though we know there are products out there containing nanotechnology, it can be difficult to identify them simply by looking at the product labels. There is no requirement for ‘nanotechnology’ to appear on the label. It could be stated but ‘hidden’ in a chemical description, which makes it difficult for the average citizen to recognize.

‘People associate food with emotion and don’t want it to appear as something artificial. Knowing that something ‘secret’ is going on will create a negative attitude towards nanotechnology.’

Frans Kampers sees this definition debate as a dead end, especially for food products. “If you look at the current definition proposed by the European Commission and apply that to food, all food products will need to be labelled as nano,” he believes. It would therefore be much simpler to focus on those types of engineered nanomaterials that could be deemed hazardous: the persistent, non-dissolving, non-biodegradable nanoparticles. These can be defined precisely, and regulation can be based on such a definition.

Moreover, these materials can be detected, even in complex matrices like food, which is a prerequisite for the enforcement of regulation. This is the same line taken in a recent report from STOA (Science and Technology Options Assessment of the EU Parliament). Nano Safety – Risk Governance of Manufactured Nanoparticles (July 2012) argues that regulation should be limited to human activities; a legal definition of nanomaterials should therefore focus on manufactured nanomaterials.

Accepting the tiny technology

As nanotechnology becomes more widely distributed in a variety of consumer products, an increasing number of people are seeking information and expressing concerns about the safety of products containing or using nanotechnologies. In 2011, the Food Standards Agency in the UK researched citizens’ opinions on nanotechnology and food through citizens’ forums. Although nanotechnology is a complex area whose risks citizens find difficult to assess, in certain areas citizens were clear.

‘Citizens want governments to act on behalf of the public interest'

First of all, they want information about research and developments, potential risks and uncertainties, and the motivations of those involved in its development to be made publicly available. This request for greater transparency clearly contradicts the more introverted attitude we have seen from the food industry itself.

This matches the findings of STOA and TA-SWISS, and echoes the debate concerning genetically modified crops: citizens are more cautious about products if they suspect that the manufacturers are not transparent about the constituents of their products. A proactive information policy and specific labelling could help prevent mistrust, says Adrian Rüegsegger of TA-SWISS. Citizens’ fear that the ratio of potential benefits to potential risks is unfavourable was one of the reasons TA-SWISS wanted to do a study specifically on the food sector.

Frans Kampers concurs: “People associate food with emotion and don’t want it to appear as something artificial. Knowing that something ‘secret’ is going on will create a negative attitude towards nanotechnology.” People need to be educated on both the benefits and the risks, he says.

Another conclusion from the Food Standards Agency’s citizens’ forums was that participants want governments to act on behalf of the public interest. Seeing the food industry as self-interested, they wanted governments to take a stronger position. However, they also wanted more citizen involvement in decisions about whether certain products were ‘worth the risk’ when it came to consumption.

During the workshops, participants developed their views as their knowledge about the issues grew. This shows the importance of educating consumers and having a transparent dialogue between the food industry, manufacturers and the government. When consumers know what’s going on and the motives behind scientific developments, they are more likely to accept new products.

Medicines (and vitamin supplements) using nanotechnology can be delivered more quickly to the bloodstream and have been used for decades. Photo: iStockphoto.

 

This method of involving citizens is well known in the area of technology assessment and was also included in the TA-SWISS study. Their ‘Publifocus’ on nanotechnology, health and the environment in 2006 aimed to find out how lay people perceived the debate on nanotechnology and where citizens saw opportunities for themselves, their health and the environment. One of its findings was that, in general, people expect more opportunities than risks from nanotechnology – their hope outweighs their reservations, says TA-SWISS project manager Emiliano Feresin. But even when participants had a positive attitude they wanted more information and labelling of food containing synthetic nanoparticles.

Andy Booth knows that it is difficult for the average citizen to understand the complexities of nanotechnology and to make up their mind about the risks and the benefits. “Nanotechnology is a huge field and it is difficult to discuss it as a whole,” he confirms. “To say that nanotechnology is dangerous is the same as saying that all chemicals are toxic. Some products with nanotechnology are perfectly fine, but when we actually consume the product the exposure is completely different.”

Product safety is paramount

The STOA study concludes that information about the ingredients, functions and effects of nanomaterials in consumer products is required by citizens and consumer organizations. Product safety is paramount and the industry is expected to provide this information in a clear and understandable way, in order to enable the public to make an informed decision.

It may be needed sooner rather than later. An expert group from the FAO and WHO identified 183 published patents containing the keywords ‘nano’ and ‘food’ in the period 2009–2011, indicating a lot of research activity and probably several ‘near-ready’ products.

Nanotechnology can be revolutionary in many areas of consumption, but it is of no use if it is not accepted by the public. If, in the future, we want the meat in our fridge to communicate with us, we will have to rely on the soundness of the science, the industry and the governments that regulate the developments.


Nanowatching

SINTEF is the largest independent research organization in Scandinavia. SINTEF creates value through knowledge generation, research and innovation, and develops technological solutions that are brought into practical use.

Institute of Nanotechnology. The Institute works closely with governments, universities, researchers, companies and the general public to educate and inform on all aspects of nanotechnology. It also organises various international scientific events, conferences and educational courses that examine the implications of nanotechnology across a wide variety of themes and sectors.

The Project on Emerging Nanotechnologies was established in April 2005 as a partnership between the Woodrow Wilson International Center for Scholars and the Pew Charitable Trusts. The Project is dedicated to helping ensure that as nanotechnologies advance, possible risks are minimized, public and consumer engagement remains strong, and the potential benefits of these new technologies are realized.

nano&me is a website for anyone interested in nanotechnologies. The site aims to bring a balanced and thoughtful perspective to discussion about nano. Through these discussions a wide range of views can then be brought to the attention of government policy makers and any business and science using nanotechnologies. The website is made by The Responsible Nano Forum and the Together Agency of Nottingham.

Wageningen Bionanotechnology Centre (BioNT) is active in the fundamental science and technology of micro- and nanosystems and their applications in food and health. The centre wants to help companies utilize the opportunities of these new technologies to innovate their products and processes and to improve our food and prevent health problems.

More:

Institute of Technology Assessment. Austrian Academy of Sciences

European Technology Assessment Group on behalf of STOA

EPTA projects database


 

Text: Marianne Barland.

Illustration: Petit Comitè.


 

What can be hacked, will be hacked http://volta.pacitaproject.eu/2-what-can-be-hacked-will-be-hacked/ Fri, 20 Jul 2012 07:27:38 +0000 http://volta.pacitaproject.eu/?p=247

 
Across the world, the number of cyber attacks on public and private critical infrastructure – assets that are essential to the functioning of our society – is growing. Little seems safe. Electricity grids, oil and gas plants, water supply systems, financial infrastructure, traffic management – they are all vulnerable. Hollywood fantasy is becoming reality.

Cyberspace is contested every day, every hour, every minute, every second

The year is 1995. In the movie The Net Sandra Bullock plays a reclusive software engineer who stumbles across plans by a secret organisation to dominate the world by breaking into critical computer systems. As she skirmishes with these mysterious Praetorians, she and her pursuers use computers as weapons, hacking into just about anything: power grids, Wall Street computers and airplanes. ‘Impossible’, many technology pundits pointed out at the time. Pure Hollywood fantasy.

These days The Net seems strangely prescient. More and more technology experts are convinced that just about anything can, and therefore will, be hacked, including vital infrastructure systems. And those pesky Praetorians? Well, some argue they became reality too. Only they call themselves Anonymous. This group of anarchistic hackers is known for its successful attacks on civic, commercial and government sites to gain notoriety and inflict damage. In February this year, Operation Unmask was launched: an international initiative supported by Interpol which led to the arrests of 25 hackers from countries in Europe and Latin America. The group, aged from 17 to 40, are believed to have links with Anonymous. According to Interpol, the international arrests followed a series of coordinated cyber attacks against the Colombian Ministry of Defence and presidential websites, as well as Chile’s Endesa electricity company and national library. On internet forums and Twitter, Anonymous has vehemently denied it would attack critical infrastructures, calling suggestions like these ‘ridiculous’ and ‘fear mongering’.

Global risk

©iStockPhoto

But the western world is vulnerable to online attacks, that much is clear. Earlier this year, the World Economic Forum listed cyber security as one of the five global risks to watch. In its Global Risks Report 2012, experts considered risks that have ‘severe, unexpected or underappreciated consequences’. The risk of critical systems failure that respondents cited most frequently was cyber attack. In the report, the WEF states: “National critical infrastructures are increasingly connected to the internet, often using bandwidth leased from private companies outside of government protection and oversight.”

How can terrorists and hackers harm or destroy critical infrastructure from the comfort and safety of their own sofas? Well, for one thing, the information is out there. There are many control systems that are accessible directly from the internet or that can be easily located through internet search engine tools and applications. “It is indeed possible to hack into critical infrastructure”, confirms Eric Luiijf, Principal Consultant at TNO, the Netherlands Organization for Applied Scientific Research. He’s been warning about this since 2002: “ICT is everywhere these days; my car has 120 processors on board. And if it can be hacked, it will be hacked, sooner or later. Even if you pay a lot of attention to security.”

Malfunction? Technical glitch?
Media reports abound. In March this year, the US Government Accountability Office (GAO) testified that at least four energy facilities have been hacked in the United States, two of them nuclear plants. As early as 2001 the Californian electricity grid was hacked, causing an outage in parts of the state. Closer to home, there are reports of multiple hackings into the Norwegian electricity grid. NASA admitted last March that hackers had broken into critical systems, including those that control parts of the International Space Station. To top it all, former US ‘cyber security csar’ Richard Clarke testified that the blueprints for the F35 Joint Strike Fighter Jet were copied by Chinese hackers breaking into Lockheed’s intranet, resulting in a serious breach of US national security. But this is just the tip of the iceberg, according to Luiijf. “Many cyber security incidents involving critical infrastructure are not properly identified as such to higher management. Moreover, organisations want to keep quiet about it to the outside. It is simply called a malfunction or a technical glitch.”

The Dutchman thinks that security is still not a primary concern in many organisations. Because of ‘ease of use’ considerations, protection of infrastructure against hackers is often minimal. Take a municipal water supply service that needs to install a new pumping system. “To manage it, they will probably get a remote access industrial control system. You can buy complete systems off-the-shelf at an industrial hardware wholesaler. And if that system has password protection, chances are the people installing it will not use it – to make it easier to access the system in the future.” The result is a weak link in the water supply chain, waiting to be tested by somebody. And it was. Recently hackers in the Netherlands took control of the pumps of a tropical swimming pool. “They were just playing with it, but it could have been a lot worse if they had malicious intentions”, notes Luiijf.

Too easy
After manufacturers denied that their systems could be remotely controlled, Dutch TV journalists broke into a pump station in Veere, a small community in Zeeland, warning the local authorities that they could turn off the pumps and flood the countryside. In a separate incident they turned off the central heating of the national headquarters of the Salvation Army. Entry to both remote control systems was possible through the internet because of a very easy-to-guess password (‘Veere’). An IT specialist hired by the journalists said on camera that within ‘half an hour’ he could teach his mother how to hack into systems like these. “That’s how simple it is.”
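
A back-of-the-envelope sketch shows why such a password offers essentially no protection. The wordlist below is hypothetical (apart from ‘veere’, the password cited above); any attacker’s first guesses would include exactly this sort of place name and vendor default.

```python
# Illustration only: a password equal to the local town's name falls to
# the very first round of dictionary guessing. All entries except "veere"
# are generic examples of what an attacker would try first.

COMMON_GUESSES = ["admin", "password", "1234", "welcome", "veere", "zeeland"]

def is_guessable(password: str, wordlist=COMMON_GUESSES) -> bool:
    """Return True if the password falls to a trivial dictionary check."""
    return password.lower() in (w.lower() for w in wordlist)

print(is_guessable("Veere"))     # True: found within the first six guesses
print(is_guessable("k9#Vx!2q"))  # False: not in any obvious wordlist
```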

Stuxnet
Flashback to Hollywood and John Travolta in Swordfish (2001). In this movie he forces a retired hacker to steal 10 million dollars (an accumulated government slush fund) from a bank. The money is destined for a secret government organisation called Black Cell, which kills terrorists who have targeted Americans. Rebels and spies using cyberspace as a battleground? It seemed far-fetched in the year terrorists used real airplanes to launch an attack on the US.
And then nearly a decade later, in 2010, Stuxnet was discovered in a nuclear plant in Iran.

Duqu – the next Stuxnet?

 


In November 2011 security firm Symantec warned of the emergence of new malware called Duqu, which contains code identical to that used in Stuxnet. It also targets SCADA systems used in power, water and sewage plants, oil and gas refining and telecommunications, but its purpose seems to be to gather intelligence for mounting future attacks. Symantec stated that Duqu infections had been confirmed in at least six organizations in eight countries (France, the Netherlands, Switzerland, Ukraine, India, Iran, Sudan and Vietnam).

Stuxnet is powerful and complex malware – malicious software – that sabotages or spies on the type of computers used in industrial control systems. The worm, which is designed to attack Siemens systems, was discovered in several important SCADA programs – those that control the operation of valves, pipelines and other industrial equipment – at the Iranian uranium enrichment facility at Natanz. According to the draft report Information and National Security by UK NATO rapporteur Lord Jopling, Stuxnet deploys two extremely complicated programming payloads to bombard the target’s operating system, causing damage to the centrifuges while blinding its systems to the reality of what is happening. Such is the sophistication of the Stuxnet code that analysts believe it was designed by the US and/or Israel or Russia to slow down the development of weapons technology in Iran. Whoever tried to thwart the Iranians, it worked. The centrifuge operational capacity at Natanz dropped by 30 percent after the incident.

Meltdown
Most experts agree that only nation states currently have the resources to sabotage a critical system of that nature but the emergence of Stuxnet suggests what is possible. From the World Economic Forum report: “A virus like Stuxnet could conceivably trigger a meltdown in a functioning nuclear power plant, turn off oil and gas pipelines or change the chemical composition of tap water.”

Stuxnet also showed the potential scale of fights in cyber space, and the gloves, it seems, are off. In the decade since Swordfish, hacking has become part of geo-political armoury, seen as being on par with conventional weapons. The American government has a doctrine that says as much:
‘When warranted, the United States will respond to hostile acts in cyberspace as we would to any other threat to our country. All states possess an inherent right to self-defence, and we recognize that certain hostile acts conducted through cyberspace could compel actions under the commitments we have with our military treaty partners. We reserve the right to use all necessary means—diplomatic, informational, military, and economic—as appropriate and consistent with applicable international law, in order to defend our Nation, our allies, our partners, and our interests.’
International Strategy for Cyberspace, The White House, May 2011

The Obama administration is also pushing for a mandatory three-year prison sentence for attacks against critical infrastructure systems.

What is…
Critical infrastructure


Countries differ when describing what exactly constitutes a critical infrastructure, also called vital infrastructure. The most important element is that they are essential to the functioning of society. The EU definition is: The physical and information technology facilities, networks, services and assets that, if disrupted or destroyed, would have a serious impact on the health, safety, security or economic well-being of citizens or the effective functioning of governments.
Think electricity systems, gas and oil plants, water supply (drinking water, sewage), transportation and financial/governmental (IT) services.

SCADA


Supervisory control and data acquisition (SCADA) systems are a type of industrial control system (ICS): computer systems that monitor and control processes in industry, infrastructure or facilities. More and more of them are becoming connected to the internet.
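
As a rough illustration, the toy loop below mimics what such a system does at its core: poll a sensor, compare the reading against a limit, and command an actuator. Every name and value is invented; real installations speak industrial protocols (Modbus, for instance) over dedicated hardware. The article’s point follows directly: if a loop like this is reachable over the internet behind a weak password, whoever logs in can issue the same commands.

```python
# Toy sketch of the monitor-and-control cycle at the heart of SCADA/ICS.
# Readings, limits and actions are hypothetical placeholders.

import random
import time

HIGH_LIMIT = 80.0  # invented alarm threshold, e.g. tank pressure

def read_sensor() -> float:
    """Stand-in for polling a field device."""
    return random.uniform(60.0, 100.0)

def control_loop(cycles: int = 5) -> None:
    for _ in range(cycles):
        value = read_sensor()
        if value > HIGH_LIMIT:
            print(f"{value:.1f} above limit -> close valve, raise alarm")
        else:
            print(f"{value:.1f} normal -> no action")
        time.sleep(0.1)  # real systems poll on fixed scan intervals

control_loop()
```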

NATO’s new policy
On the military side, NATO – whose own networks are constantly under attack by hacktivists – was early to spot cyber security as a serious issue when it implemented a Cyber Defence Programme in 2002. Last year, NATO defence ministers adopted a new cyber defence policy, focusing on prevention and building resilience. In November 2011, in an opinion piece for The New York Times, NATO’s Supreme Allied Commander Transformation, French General Stéphane Abrial, wrote that cyber attacks are “among the most pressing and potentially dangerous threats to our collective peace and security.”

Abrial: “In discussing a hypothetical major attack, NATO leaders are often asked what circumstances would trigger a response under Article V of the Washington treaty — in other words, when would an attack against one be considered an attack on all? It would not be prudent to try to define exact tripwires in advance, or to tie our hands as to how we would react. But assuredly, the alliance would respond deliberately to any significant attack, adapting its reaction to the extent of the damage, the degree of certainty in attribution, the identity of the attackers and their perceived intentions.”

In the article, the NATO Commander states that civilian authorities in all 28 NATO member nations have the lead responsibility on cyber security. Abrial: “NATO is therefore working in support of whole-of-government approaches to cyber defence — led by civilian agencies in each nation — and with actors outside government. Key among those are commercial suppliers and the wider industrial base, since NATO-wide, 85 percent of critical infrastructure is in private hands.”

©iStockPhoto

Who hacks?
Professor Solange Ghernaouti-Hélie of the Faculty of Business and Economics at Lausanne University (Switzerland) is an international expert in cyber security and cybercrime. She has seen hacking become a weapon but acknowledges there is no clear profile of those wielding cyber weapons. “There are all kinds of people who hack into critical systems,” says Ghernaouti-Hélie. “Think of 16-year-old boys who want to prove that they can. But also criminals who want to blackmail the owners of a system. And lately we see government agencies trying to generate chaos in another country. The internet is very busy with people trying to do harm.”

As said previously, Stuxnet was unusual both in the complexity of its code and the nature of its intended target. Sources in The Economist claimed that its designers must not only have had access to the target plant’s blueprints and a detailed knowledge of Siemens’s industrial-production processes and control systems, but also pointed to their use of four previously unknown Windows security holes – known as zero-day vulnerabilities – which are so valuable to hackers that they would not generally use so many in a single attack.

Thomas Rid and Peter McBurney of the War Studies Department at King’s College London believe that the more destructive a cyber weapon is, the more expensive and difficult it will be to produce, especially in terms of the intelligence needed about the target. As a consequence, such cyber weapons will be very specific, not easily repurposed, and unlikely to cause collateral damage. In a report on cyber weapons produced earlier this year, they concluded that: “The cost-benefit payoff of weaponised instruments of cyber-conflict may be far more questionable than generally assumed: target configurations are likely to be so specific that a powerful cyber weapon may only be capable of hitting and acting on one single target, or very few targets at best.” While Ghernaouti-Hélie agrees that hackers or terrorists are not yet knowledgeable enough to produce something as destructive as Stuxnet, there is danger in other collaborations: “We see more and more links between radicals and tech-savvy criminals, who do know how to penetrate a critical system. If your goal is to disturb and disrupt, hacking is an excellent way to reach your goal.”

And she believes hacking is developing into a powerful weapon that might force us to rethink current political conflicts. “Take the Israelis and the Palestinians. They hack each other on a daily basis. No amount of security is going to stop some of these hacks to be successful, because both sides are incredibly motivated. If you don’t solve the root of the problem – the conflict between the two states – you are not going to stop the relentless hacking.” Since that might not be on the cards – Israel and the Palestinians have been at each other for decades, for example – governments and companies have no other choice than to invest heavily in cyber security to keep their, and our, critical infrastructure safe.


One of the biggest problems is that security is often just an afterthought

As a result, security is now the single biggest software market. But even the best security is not a cure-all, according to Professor Bernhard Hämmerli, cyber security expert at the Lucerne University of Applied Sciences (Switzerland). Since so much of our society is now online, protecting each and every nook and cranny of our networked lives has become impossible. Hämmerli compares it to guarding an extremely long fence. Unless you have guards at ten-meter intervals, somebody can (and therefore will) climb across. “The defender has to defend everything, the hacker can be specific. He can stake out a system for a long time and look for that one weak spot he needs to get in. To make it even more difficult, IT infrastructure is constantly evolving. You have constant updates, maintenance, new applications; each and every change you make to a system could render it more vulnerable to a breach of security.” And then there is the money issue. Hämmerli: “Budgets always have limits; no organisation in the world has the funds to completely seal off a system.”
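
Hämmerli’s fence can be put into rough numbers. The calculation below uses invented figures, but it shows how quickly the defender’s odds deteriorate as the attack surface grows, even when each individual entry point is very well protected.

```python
# Back-of-the-envelope defender's dilemma: the chance that *every* entry
# point holds shrinks quickly with the size of the attack surface.
# The 99.9% per-point reliability is an assumption for illustration.

def p_all_hold(points: int, p_single: float) -> float:
    """Probability that all entry points resist attack, assuming independence."""
    return p_single ** points

for n in (10, 100, 1000):
    print(f"{n:>4} entry points at 99.9% each -> "
          f"{p_all_hold(n, 0.999):.1%} chance the whole perimeter holds")
# 10 -> 99.0%, 100 -> 90.5%, 1000 -> 36.8%
```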

The World Economic Forum suspects that some security suppliers themselves could be in on the hacking game. In their Global Risks Report, the Forum stresses one of the key challenges in cyber security, that ‘incentives are misaligned’: vendors of online security products have a financial interest in talking up the threats of cyber crime, while the victims often have an interest in remaining silent. It believes correcting such ‘information asymmetries’ should be at the centre of policies to improve global cyber security.

Fire sale
Security professionals turning into hackers brings us to the summer blockbuster of 2007. Live Free or Die Hard stars Bruce Willis as an analogue cop in a digital world. While escorting a young hacker to the FBI, Willis finds himself in the middle of a ‘fire sale’, a state of utter chaos caused by the simultaneous hacking of several critical systems including utilities, traffic management and communications. This large-scale hack is performed by former US government security adviser Thomas Gabriel, who is proving a point: he warned in vain that such a large-scale attack was possible and is now causing mayhem.

The world has yet to witness a real fire sale of simultaneous hacks against critical infrastructure, but if past movies about hackers are anything to go by, we should see one in about five years. Probably not as spectacular as in the movies – hacking in real life never is.

For critical infrastructure IT professionals from around the world, it is only a matter of time. In a survey by security firm McAfee of 600 IT specialists from 14 countries, more than half the respondents think we will witness large-scale attacks within the next few years.

The internet of things
Robbert Kuppens, Chief Information Officer for Cisco Systems in Europe, the Middle East and Africa, also thinks it could be on the cards. His company manufactures a large portion of the infrastructure that powers the internet, so it must remain one step ahead of the hackers. According to Kuppens, new threats are constantly lurking in the dark corners of cyberspace; there is no room for complacency as more and more devices, such as smart electricity meters, connect to the internet. The US Government Accountability Office (GAO) recently underlined this in a report on electricity grid modernization, with the realistic headline ‘Progress Being Made on Cybersecurity Guidelines, but Key Challenges Remain to be Addressed.’

Shockingly slipshod
Kuppens: “We are currently heading for the internet of things, in which many devices that were until now offline will connect to the internet, either by cable or wireless. All these new devices are potential leaks for the networks they are connected to, so you should secure them all. Don’t think for a moment that a device will not be hacked because it does not look like a computer. Take mobile phones. Until recently a lot of people thought they could not be hacked, but now we know that is not true anymore.”

Kuppens says that a lot of companies and governments are very security conscious. But he also regularly encounters critical systems, both public and private, protected by shockingly slipshod security measures. “One of the biggest problems is that security is often just an afterthought. And that the people who make decisions about investments in hardware and software are sometimes ill-informed. Security costs money, while its benefits are often not immediately clear to the layman. And if there are security-conscious IT staff in an organisation, we find they lack strong support from management to invest in the necessary hardware and software.”

Security and security management nowadays demand a holistic approach. It is no longer the responsibility of IT alone, but of the organisation as a whole.

©iStockphoto

"No organisation in the world has the funds to completely seal off a system"

Next target: energy supply
So, where could a large-scale attack take place? Kuppens thinks – and Hämmerli, Luiijf and Ghernaouti-Hélie agree – that the energy supply is a logical target. In Europe, the management of electricity is often centralised, with one organisation controlling the whole electricity supply. Electricity grids are often managed online, which increases the risk of a breach of security at the central level. In a worst case scenario, an attack could shut down the electricity in a whole country or even the whole of Europe.

Cyber incidents have already taken place in energy facilities. In 2009, at a hearing of the US Congress, national security officials stated that cyber spies had compromised the electrical grid of the United States and installed software programs that could disrupt the system when activated by a hacker.

In testimony for a committee of the US House of Representatives, the US Government Accountability Office (GAO) cited four incidents concerning energy plants. Apart from Stuxnet in Iran, the GAO believes that the 2006 failure of two circulation pumps at Browns Ferry, a US nuclear power plant in Alabama, was caused by cyber security breaches. In 2003 an alarm processor at FirstEnergy, an Ohio-based electric utility, failed, resulting in the cascading failure of 508 generating units at 265 power plants across eight US states and a Canadian province.

Earlier that same year a worm known as Slammer infected a private computer network at the Davis-Besse nuclear power plant in Oak Harbor, Ohio. It disabled a safety monitoring system for nearly five hours. In addition, the plant’s process computer failed, and it took about six hours for it to become available again.

James Lewis, cyber specialist at the American Center for Strategic & International Studies (CSIS), has been keeping a list of ‘significant cyber incidents’ since 2006. According to this list, Norway’s National Security Agency (NSM) reported that in 2011 at least 10 major Norwegian defence and energy companies were hacked: “The attacks were specifically ‘tailored’ for each company, using an email phishing scheme. NSM said that the attacks came when the companies, mainly in the oil and gas sectors, have been involved in large-scale contract negotiations. The hacking occurred over the course of 2011, with hackers gaining access to confidential documents, industrial data, usernames and passwords.”

Holistic approach
So, how do we deal with these threats? The response from governments is a mixed bag, according to security specialist McAfee. Governments continue to play an ambiguous role in cyber security – sometimes helping the private sector, sometimes ignoring it. The US and the UK are taking the lead in developing cyber security strategies and have made cyber security a top priority in their national security programmes. The US has its Cyber Command, the UK its Government Communications Headquarters (GCHQ). GCHQ director Iain Lobban, quoted in The Guardian, has no illusions about the scale of the threat: “Cyberspace is contested every day, every hour, every minute, every second,” he said. “I can vouch for that from the displays in our own operations centre of minute-by-minute cyber-attempts to penetrate systems around the world.”

The EU is slowly stitching together a holistic approach. In 2011, the European Commission published the Communication Achievements and Next Steps: towards Global Cyber-security. It focuses on the global dimension of the challenges and the importance of boosting cooperation among EU states and the private sector at national, European and international level. The EU is striving for more awareness and preparedness.

European member states are rapidly installing national CERTs (computer emergency response teams), while ENISA, the EU’s cyber security agency, has issued a thick study on industrial control systems (ICS) security. Derived from a hundred key findings, the report proposes seven ‘urgent’ but ‘challenging’ recommendations for improving ICS security. The recommendations call for national and pan-European ICS security strategies, a Good Practice Guide on ICS security, research activities, spreading awareness, the establishment of a common test bed and ICS computer emergency response capabilities. ENISA stresses the importance of active collaboration between public organizations and the private sector. Earlier this year ENISA saw its mandate extended after the successful coordination of the first pan-European cyber security exercise. This was despite criticism, reported by German think tank Bertelsmann, that its location on Crete makes it hard to attract qualified IT staff.

New laws needed
Across the world, reports are written, tough words are spoken and action lists are formulated. But stopping hackers from interfering with our critical infrastructure is not so easy. Existing regulation is not enough, the experts say. International laws and international or even global cooperation are the key, as these are often cross-border crimes with major jurisdiction issues. That is, if you can even identify where an attack comes from. There must be a new framework. “We need new laws. We should determine internationally what is and what is not punishable when it comes to the internet”, according to Cisco’s CIO Kuppens. “While politicians tend to look at their own back yard, the virtual world knows no borders. Something that is prosecutable in one country is allowed in the next. We should have treaties about what constitutes a cyber crime and how and by whom it should be punished. Perhaps we could establish a WTO-like organisation to battle cross-border online crime.”

NATO rapporteur Jopling proposes just that: “On the global level, NATO should support initiatives to negotiate at least some international legal ground rules for the cyber domain. International law should clearly prohibit the use of cyber attacks against civilian infrastructures.” Jopling also called for NATO member states to hurry up when ratifying binding international treaties, like the Council of Europe’s Convention on Cyber crime, because banning cyber criminal activities would also help in dealing with cyber terrorists and state-sponsored cyber attacks that often use the same techniques as cyber criminals.

A role for the UN?
Professor Ghernaouti–Hélie sees a role for the UN. According to her, this is the only international organisation with sufficient clout to author an enforceable code of online conduct for states, companies and individuals. “We need to integrate security in every piece of technology that is coming on the market. Only an international organisation like the UN can force the market to do that. We need a UN charter for the internet that establishes what you can and cannot do online.”

Chances are slim, however, that such a scenario will unfold. Although the UN is working on cyber security, through the UN General Assembly and the International Telecommunication Union, there is as yet no UN cyber security department. A spokesperson for the UN says there are to date no plans for a charter, new laws or a conference on the subject. International law specialists question the UN’s capacity on this subject because the conclusion of international treaties has declined sharply since the nineties. Most plausible is that bilateral treaties and regional or, if possible, global partnerships might help generate some agreement on establishing cyber security. In the meantime nations, owners of critical infrastructure, and the rest of us are left to fend for ourselves.

Where is the Hollywood superhero to keep us safe in cyberspace?

For this article, Volta used the sources listed below – and many more. They might be a reference point for your research.

INTERNATIONAL COOPERATION


Achievements and Next Steps: Towards Global Cyber-Security
European Commission Communication on Critical Information Infrastructure Protection (Brussels, 2011)

A List of Significant Cyber Incidents since 2006
James Lewis, CSIS (2006-2012)

Baseline Capabilities of National/Governmental CERTs – Part 2: Policy Recommendations
EU (2010)

Defending the Networks – The NATO Policy on Cyber Defense
NATO (2011)

Global Risks Report
World Economic Forum (2012)

Information and National Security – Draft General Report
Lord Jopling (United Kingdom) General Rapporteur, NATO Parliamentary Assembly (2011)

Nato builds its Cyberdefences
Stéphane Abrial, The New York Times (2011)

Protecting Industrial Control Systems – Recommendations for Europe and Member States
ENISA (2011)

Reducing Systemic Cyber-security Risk
OECD – IFP Project on Future Global Shocks – Peter Sommer and Ian Brown (2011)

Rethinking Cybersecurity – a Comprehensive Approach
James Lewis, CSIS, 2011

ACADEMIC


Cyber norm emergence at the United Nations – An Analysis of the Activities at the UN regarding Cyber-security
Tim Maurer – Belfer Center for Science and International Affairs (2011)

Cyber War Will Not Take Place
Thomas Rid – Journal of Strategic Studies (2012)

Cyber Weapons
Thomas Rid and Peter McBurney
The RUSI Journal (2012)

Security Economics and Critical National Infrastructure
Anderson and Fuloria, Springerlink (2010)

Software Failures, Security, and Cyberattacks
Charles Perrow (2008), Schwerpunkt, Technikfolgenabschätzung (2011)

MISCELLANEOUS


Anonymous says Power Grid Concerns are U.S. Gov’t Spin
SC Magazine (2-02-2012)

Assuring a Trusted and Resilient Information and Communications Infrastructure
US Cyber Space Policy Review (2010)

Critical Infrastructure Protection: Key Private and Public Cyber Expectations Need to be Consistently Addressed
GAO (2010)

Cyber Security in the UK
Post – Postnote (2011)

EU Cyber Security Policy
Eurowire – Bertelsmann (2011)

Hackers reportedly linked to ‘Anonymous’ group targeted in global operation
Press statement, INTERPOL (2012)

In the crossfire. Critical infrastructure in the age of cyber war
McAfee/Center for Strategic and International Studies (CSIS) (2009)

In the dark. Crucial industries confront cyberattacks
Baker, Filipak and Timlin, McAfee/CSIS (2011)

The meaning of Stuxnet – A sophisticated “Cyber-missile” highlights the Potential – and Limitations – of Cyberwar
The Economist (2010)

The Stuxnet Outbreak -A Worm in the Centrifuge
The Economist (2010)

W32. Duqu – The Precursor to the next Stuxnet
Symantec (2011)

ONLINE READING


Nato
NATO has an online library which provides a ‘few starting points to assist you with your research on issues related to cyberspace security, in particular, in the NATO context’. See www.natolibguides.info/cybersecurity

EU Policy on Network & Information Security
ENISA
EU Digital Agenda website

GAO – Cyberwar resources Guide

Cyber attack timelines
Italian IT specialist Paolo Passeri collects cyber attacks and puts them in daunting monthly and yearly timelines.

Computer security
Fellow TA colleagues at ITAS / KIT are working on Compartmentalised Computer Security (CCompS), trying to isolate operating systems and applications of differing sensitivity or risk from one another.

 

Text: Philip Dodge and Pascal Messer.

Illustration: Petit Comitè.

European power struggles http://volta.pacitaproject.eu/1-energy-technologies/ Tue, 17 Jul 2012 07:41:24 +0000 http://volta.pacitaproject.eu/?p=63 Can public resistance be overcome?

Call it people power. Or rather, people against power. Ordinary Europeans have become experts in delaying or even stopping the introduction of energy technologies. How can public resistance be overcome?

 

It’s not the technology, it’s the way you use it

There have been protests all over Europe against wind farms, geothermal facilities and other green initiatives. People are organising, demonstrating, and attempting to elect politicians who promise not to build anything. If that isn’t enough, citizens are using the courts to tie up planners and builders of new energy technologies for years, often with the help of local municipalities or environmental groups. Yet if we want to have energy in the future, we have to build new power generating facilities, if only to replace the old ones. And not too far away from large population centres either, so as not to waste power. How can policymakers, politicians and planners overcome the serious lack of societal acceptance of future energy plans? Transparency is the key, says Stefan Gold, from the Institut de l’entreprise, Université de Neuchâtel in Switzerland. He researches stakeholder management in energy production: “Politicians who are planning any kind of energy facilities have to be completely open about their plans. Honesty is the only policy, any kind of deceit or ambiguity will come back to haunt you.”

Blind spot

There’s no doubt public participation complicates things. Indeed, communicating with local communities is a bit of a blind spot for most of our leaders. In a democracy, politicians are chosen by us to make tough decisions. Strictly speaking, they can ignore protests and plan new facilities wherever they like. But involving the general public should be an integral part of the decision-making process, according to Gold, because it enhances the legitimacy of the choices a government makes.

People have a deep mistrust of planners and politicians who lack transparency

While you are probably not going to come up with a single location for a future wind farm or biomass facility that is acceptable to everyone, with greater citizen involvement the dilemmas involved can be seen by all; the process is clear. There is no ‘democratic deficit’ in the planning process, and politicians hopefully gain a better understanding of the societal impact of the project. “By consulting with those who live near a future site you also get a clear idea of the preferences of the population,” continues Gold. “You can build consensus among a large part of the population and garner support. Of course, there are always going to be people who are against building anything anywhere. And they can be very strident about it.”

Nimby? Pimby? Banana?

All over Europe, people have found effective ways to kick planners out of their backyards. Protesters in Wales stopped the building of a biomass power plant in Port Talbot. Tidal power projects were cancelled in Ireland. Fishermen in France torpedoed an offshore wind park of 100 turbines at Arromanches. There is Europe-wide resistance against shale gas drilling and carbon storage injection projects [see text boxes], while the European Platform against Wind Power group unites turbine haters across the continent.

The Nimby (Not In My Backyard) syndrome is often believed to be the problem. We all want electricity, the theory goes, but we do not want it to be generated anywhere near us. As the list of failed energy projects goes on and on, a new acronym was coined a couple of years ago. Some say we have now advanced to Banana: Build Absolutely Nothing Anytime Near Anyone.

Should we give up on a greener future? No, says Maria Pia Misiti, secretary of the Associazione Pimby in Italy. Her organisation – the name is a pun and stands for ‘Please in my back yard’ – tries to get planners, politicians and the general public to engage in dialogue so that fewer projects fail. “We studied all the cases in which people successfully opposed infrastructure projects and found flaws in the government’s communications every time”, she states. But there were also similarities between the successful projects: local communities almost always gained something.

Does she mean you can buy the support of communities? “It is not so much buying as compensating. The country needs a new road or a power plant, but what do the local people need? It could be a park, a local road or a community centre. When you lose something, for instance an uninterrupted view or peace and quiet, it is reasonable you should get something in return.” The Pimby manifesto was signed by politicians, community leaders and those responsible for Italian infrastructure. The next step is to get the central government to pass a law that makes it mandatory for planners to compensate local communities. As a bonus, they would have to communicate with people near the site at an early stage. It’s something that Italy needs in order to move into the future, according to Misiti. “Some infrastructure projects in our country are delayed for decades because authorities and communities are battling it out in court. By making it compulsory to negotiate about a project at an early stage, we hope to move forward.”

Top-down decision making

As the Italian example shows, the struggle of local communities is not so much with technology, as with failed processes and rigid top-down decision making. A lot of research on societal sustainability corroborates that. Nimby is an empty concept, scholars say. It is a simplification of a complex interaction between governments and the general population. Some academics believe that acknowledging the Nimby concept actually hinders policymakers and energy companies in achieving public acceptance of energy technology. “The recognition of any Nimby-motivated resistance has become a weapon in the small wars that are fought to influence place-making decisions. It is the ultimate legitimisation for not considering the arguments that are put forward. This practice of disregard of important elements of the issue is counterproductive, though, and it might eventually become one of the major sources of societal resistance”, writes researcher Maarten Wolsink in a ‘critique on the persistence of the language of Nimby’. There are long-term risks, continues Wolsink: “As the opponents as well as their arguments are lumped together and collectively ignored, their acts will rapidly turn into strategic behaviour only, focusing on obstruction, rather than on adjustment and influence. All studies on location conflicts, including those that claim to look at ‘Nimby and beyond’, show that it is not a wise policy strategy to disregard the objections.”

Clean Green

Psychologist Gundula Hübner of the Martin-Luther-University in Halle-Wittenberg, Germany, studies the acceptance of green technologies and environmental law by the general public. Despite all the resistance against energy technologies, she thinks implementing cleaner technologies could be the answer. Technology, in other words, is both the problem and the solution. “When it comes to public opposition, there is a huge difference whether you want to build a nuclear reactor, a coal-fired plant or a wind farm. People have a clear preference for green technologies”, Hübner claims, citing a study of public views on power lines in Germany and England: “We do still have opposition if the lines are for green energy. However, people are less sceptical. When they assume the lines are used for nuclear power or coal they object to them more strongly.”

But what about the protests against wind farms? Doesn’t that prove that even they are not readily accepted? Hübner points to the examples of communities which volunteer to host wind farms. It is not the turbines as such that people usually have trouble with, but who decides where they will be erected. “Research shows people have a deep mistrust of planners and politicians who lack transparency. They want to be in on the decision-making process, not stand on the sidelines and wait for others to decide about their region. Go over people’s heads and they are going to block any decision you make. Use their knowledge and you might be surprised how cooperative people really are.”

Shale gas ignites local hostility
‘Environmental protests could kill the nascent industry in Europe’

Small earthquake

The bad news is that drilling for shale gas is a disruptive – and controversial – procedure. The layers of rock that contain the gas have to be fractured using fluid under high pressure for the gas to escape, a technique called ‘hydraulic fracturing’ or simply ‘fracking’. Basically, that means causing a small earthquake: water, sand and toxic chemicals are pumped deep underground at tremendous pressure to break up the layers of shale and release the gas so it can be pumped to the surface. More bad news is that fracking has become a serious environmental and health issue, with moratoriums in place in New South Wales (Australia), the Karoo basin (South Africa), Quebec (Canada) and several US states.

Frack off

Public protest against fracking is on the rise. Earlier this year, public pressure made France the first nation to officially ban the technique. Internationally, protesters are organizing rapidly and exchanging information through YouTube and websites such as Frack-off.org.uk. The UK recently saw its first ‘Frack Mob’ mass action, in which protesters halted work at a drilling site in Hesketh Bank, Lancashire. Protesters point to the potential contamination of groundwater, earthquakes, risks to air quality, the possible migration of gases and fracking chemicals to the surface, the mishandling of waste, and the health effects of all of these. On YouTube, some Americans claim that shale gas leaking into their drinking water supply caused their tap water to ignite. None of the counter-arguments from politicians and companies seems to convince the public.

Bruno Vigier is the mayor of Les Vans, a town in the French Ardèche that stopped energy companies from drilling. Vigier himself sided with the protesters: “I was angry and shocked that we were not informed about the decision to drill near our town. As soon as we saw the plans, we knew that it was going to cause great damage to the environment. That is contrary to our policy of protecting nature and having clean rivers and lakes for tourists to visit.”

Local Hostility

According to Oxford Institute for Energy Studies researcher Florence Gény, the biggest challenges to full-scale production of shale gas in Europe will be cost and land access. “Land access is a huge issue linked to severe spatial restrictions resulting from high levels of urbanisation in North Western Europe; extensive regulatory protection of sites and landscapes; and difficulties in accessing private land due to local hostility”, she writes. Gény advises operators to develop mechanisms that give landowners an incentive to cooperate and that involve stakeholders in decisions affecting local socio-economic and environmental conditions. But perhaps most importantly, if the industry is to develop in Europe, she says there must be “better communication on environmental impact and responses to growing public concerns arising from US operations. Environmental issues could be a killer to the nascent industry in Europe, as they could be a serious brake on US shale gas operations. We think the US needs to clear its environmental debate before Europe can fully embrace unconventional gas.”

Read more?
Methane and the Greenhouse-gas Footprint of Natural Gas from Shale Formations
Robert W. Howarth, Renee Santoro and Anthony Ingraffea, Climatic Change Letters (2011)

www.propublica.org – ProPublica, ‘investigative journalism in the public interest’: Pulitzer prize-winning journalists track US gas drilling.

www.frack-off.org.uk – don’t frack with the UK.

Can Unconventional Gas be a Game Changer in European Gas Markets?
Florence Gény, Oxford Institute for Energy Studies (2010)

No sense of urgency

Senior researcher Jurgen Ganzevles of the Rathenau Instituut, a Dutch technology assessment institute, recently painted a grimmer picture of the acceptance of green technologies. In a comprehensive study of future energy systems, Ganzevles states that all energy technologies are controversial, ‘whether new or old, grey or green’. The root of the problem is the lack of a sense of urgency among both the public and policymakers, which sustains collectively shared myths about an easy and painless transition to sustainable energy – lullabies that send people to sleep. Key to Dutch local resistance, Ganzevles believes, is the absence of a firm national political strategy on the future energy mix. He advises Dutch politicians and policymakers to start educating the general public quickly: people need to realise that painful choices will ultimately have to be made if they want clean, affordable and reliable energy in the future. Good government communication and shared knowledge might well help tackle public resistance.


Keep talking

Hübner believes governments and local populations often communicate on different levels, perhaps even in different languages: “A civil servant is used to working with facts and figures. His boss tells him he wants to generate more wind energy, so he consults a wind map to decide where to build the turbines. If you live near that place, you do not care about wind charts. Your response is based on your emotions. These turbines might produce noise and they are going to spoil your view.” Smart planners use local expertise to find the right location – and, whatever happens, they keep talking.

In the German region of Niedersachsen, a wind developer had to go to court to win permission to build a wind farm. It won, and local protesters had to accept that their horizon would include turbines. It would have been very easy for the developer to build on the site and ignore its neighbours. Instead, when the court battle was over, the company went back to the community. A plan was drawn up for an independent authority to measure the noise of the turbines; if they were deemed too noisy, the company would take action. Hübner: “I know the measurements are going to be impartial because they asked me and my colleagues to do them.”

CO2 storage
‘Glossy brochures are not the right communication tool’

Saying a planned facility is ‘green’ or ‘safe’ won’t work – at least not in Germany or the Netherlands. ‘There is a great enthusiasm for science. However, when scientific discoveries are transferred into technology, opposition comes forth.’

In September 2011, the Bundesrat, the upper house of the German parliament, blocked a law that would have allowed the underground storage of carbon dioxide in a bid to reduce emissions. The government must now come up with a revised bill to conform to a European Union directive on the technology. One year earlier, the Dutch national government had been forced to announce that a similar test site underneath the residential area of Barendrecht, near Rotterdam, was to be scrapped. “The three-year delay to the project and the total lack of support in the locality were the main reasons behind the decision”, economic affairs minister Maxime Verhagen said.

Buying time

Carbon Capture and Storage (CCS) is a relatively new technique for permanently storing the greenhouse gas CO2 in order to curb emissions. The gas is captured at fossil-fuel-burning plants or industrial installations, liquefied and then buried underground, usually in depleted natural gas reservoirs. The technology has been used in gas fields under the sea, but not near populated areas. In Europe, many of the pilot projects are partly funded by the European Union. CCS is seen as a way of buying time: time for politicians to forge an effective treaty on greenhouse gases and to wean the global economy off fossil fuels.

Not in our community

In both the German and Dutch cases, national governments met with fierce opposition. Locals feared gas leaks or sudden, uncontrolled releases of the stored CO2. In the case of Barendrecht, the local community also feared a drop in property values. In Beeskow, a quiet town in the eastern German state of Brandenburg, mayor Frank Steffen said: “A field trial under our community is not acceptable.” Critics believe that the large investment required would be better spent on renewable sources of energy, such as solar and wind power, or on nuclear power. “In my view, CCS is fundamentally wrong”, Steffen told the news magazine Der Spiegel. “It was invented to keep the old-fashioned way of producing energy from coal alive.”

In an interview with the newspaper Der Tagesspiegel, Brandenburg’s economy minister, Ralf Christoffers, said: “In Germany, there is a great enthusiasm for science. However, when scientific discoveries are transferred into technology, opposition comes forth.” He pointed out that public resistance to CCS goes hand in hand with opposition to building a new power infrastructure for renewable energy – notably wind power. “The focus of our energy policy is the expansion of renewable energies. That is a huge problem, because the resistance is growing. In Brandenburg, we must build nearly 1000 kilometres of new electricity lines, but we need acceptance.” Asked how to achieve that acceptance, Christoffers stated: “You need to talk to the people.”

Safe soda?

Indeed, in both CCS cases poor communication seems to have been the problem. Carbon dioxide is odourless and not in any way dangerous, local communities were told, and even if the gas did somehow escape to the surface, the risk would be zero. If it were to creep into the drinking water supply, as some people feared, scientists said it would merely carbonate the water, not unlike a soda. Wrong, says stakeholder management researcher Gold [see main story]. “There should be no absolutes in risk communication. Never pretend to have all the answers. Admit that it is a relatively new technology and you expect risks. Don’t feign certainty.”

The people of both Barendrecht and Beeskow proved Gold right. They started lobby groups and organized fierce protests. Politicians, planners and engineers assured locals that nothing could go wrong, but to no avail: citizens remained passionately opposed to the plans. Jurgen Ganzevles, senior researcher at the Rathenau Instituut in the Netherlands, believes the Dutch authorities should have painted a much broader picture instead of simply underlining the safety of the technology. “What they should have done in Barendrecht is demonstrate how important heavy industry is to both the local and national economy. Only then do you explain that a former gas field is a good place for storage. Concerns about safety will be seen in a different light after that.”

Lessons learned

Meanwhile, the European Union has established a Network of CCS demonstration projects ‘to generate early benefits from a coordinated European action’. In May 2011, the experiences and lessons learned from six full-scale European CCS demonstration projects were shared with the public in Rotterdam. In its online newsletter, the CCS network dryly reports: “Ignoring stakeholders and under-estimating the influence of the local community is likely to cause delays.” It is a lesson learned, the network writes: “It is generally felt that development of and engagement in dialogue, especially with local stakeholders, is to be preferred above one-sided dissemination of ‘corporate’ project information. This is especially true for those projects who foresee onshore storage of CO2.” The network also gives an example of how not to communicate with local people. In 2009, Vattenfall, a Swedish energy company involved in CCS, announced CO2 storage plans using its ‘standard’ communication format. It soon became clear that “glossy brochures are not the right communication tool in order to get local people to trust the company.”

Dread factor

In its 59-page Thematic Report on Public Engagement, the CCS network concludes that an important purpose of public engagement is the ‘challenging’ task of communicating and educating the public about the risks related to CCS: “Whereas analysts and risk experts tend to employ quantitative risk assessments to methodically evaluate hazards, the majority of citizens rely on intuitive risk judgments, called ‘risk perceptions’.” The public may regard CCS as a new technology and not necessarily trust experts’ claims that it is safe. Furthermore, the distribution of risks and benefits is bound to be perceived as uneven, since some people must be the ones living closest to a storage site, and the hazard of a leak is difficult for ordinary people to observe. Risks may also be amplified through social mechanisms, according to the report, “thus contributing to the dread factor.”


Read more?
Public Engagement: Lessons Learned in 2010 – A Report from the European CCS Demonstration Project Network
www.ccsnetwork.eu

Dancing Ladies

Listening to local people is something Andy Clements does very well. But then again, the chairman of Gigha Renewable Energy has no other choice. As is usual on small Scottish islands, Clements has more than one job: he is the local fire chief, the head of island maintenance and a farmer. Most importantly, though, he runs three wind turbines on the island of Gigha (pronounced Gee-ya). This gorgeous speck off the Kintyre peninsula (of ‘Mull of Kintyre’ and Paul McCartney fame) is energy self-sufficient thanks to the wind, and the electricity surplus is sold to the mainland. Was there opposition before the turbines were built?

It’s the communication, stupid!

Communication

Communicating with locals and the general public is a key factor in public acceptance. It is a two-way process, so do not just send out your message; listen first.

Forget about Nimby

The Nimby label often amounts to a dismissive disqualification of public protest. It can be perceived as an attempt to brand opponents in advance as ‘others’, or at least as ‘the other side’. That breeds conflict.

Paint the whole picture

Explain your policy of making the energy supply more sustainable, and explain why the biogas plant you want to build is a crucial part of that plan. 

Show and tell

Explain to the general public what choices have to be made, why, and how you make them. Do not withhold information. It will come back to haunt you. 

Use local knowledge

People who know an area can help you choose the right location for a new power station. 

Let the locals benefit

Supply local communities near a wind farm or geothermal power station with cheaper electricity or other benefits. 

Listen

Never. Stop. Listening. Even after you build a (green) power station, keep lines of communication open.


So, was there opposition on Gigha before the turbines went up? A lot of discussion, says Clements, but what clinched it for locals was managing the project as a community. He recently visited a remote part of Norway where local tensions were running high. Why? “Because all of the turbines and biogas installations were privately owned. So people were irritated by the sounds and smells of other people’s businesses. I gave them one piece of advice: solve your energy needs together. Don’t build five small turbines, build one big one.” The ‘dancing ladies’, as locals call the Gigha turbines, have saved the island, and Clements believes it is a model that could work in many places. “You can see a wind park on the mainland from here. It is owned by a big electricity company. The people who live next to it are not happy. We love our turbines; they pump new life into our community. Even the guy who lives right next to them is comfortable with them, because otherwise our existence here would be a lot harder. It’s not the technology that makes the difference, it’s the way you use it.”

Read more?
Social Acceptance of Renewable Energy Innovation: An Introduction to the Concept
Rolf Wüstenhagen, Maarten Wolsink, Mary Jean Bürer (2006)

Sustainability Assessment of Energy Technologies via Social Indicators: Results of a Survey among European Energy Experts. Diana Gallego Carrera, Alexander Mack (2009)

Invalid Theory Impedes our Understanding: A Critique on the Persistence of the Language of NIMBY
Maarten Wolsink, Transactions of the Institute of British Geographers (2006)

The Relative Importance of Social and Institutional Conditions in the Planning of Wind Power Projects
Susanne Agterbosch, Ree M. Meertens, Walter J.V. Vermeulen (2007)


Accept, consider, reject?


How do we form opinions on issues like energy and technology? And what information shapes our views? volTA takes a look.

Context matters

You are confronted with a new technology you know next to nothing about. Do you already have an opinion about it? You might, according to research from Eindhoven University of Technology in the Netherlands. Researcher Wouter van den Hoogen found that people base their positive or negative feelings about a new technology on the context in which they encounter it. “If another energy source was casually mentioned just before the assessment of biomass, then their opinion about the use of biomass was assimilated to the use of the other energy source”, writes AlphaGalileo about the research.

Context Affects Opinion about Novel Energy Sources
Eindhoven University of Technology, July 26, 2007


Wind farms? Mais Oui!
France is planning to substantially increase the amount of electricity generated by wind power by 2020. But this government-sponsored programme could be seriously delayed if acceptance of wind turbines is low. A study was therefore conducted in four coastal regions of France where multiple wind farms are already operating. It shows high acceptance of wind turbines among those who live near them: respondents at the four sites not only have a positive perception of wind energy in general, but also regard ‘their’ wind farm with a proprietary eye. Only five percent of those living near a wind farm believe that turbines are a bad idea.

Evaluation of Some External Effects Produced by Wind Turbines
Commissariat général au Développement Durable (CGDD), June 2009

Local positivism

People like renewable energy (or say they do). So why the protests when a renewable power station is planned near their homes? According to this report, the acceptance of new energy technologies at the local and regional level is first and foremost shaped by non-technological factors. History plays an important role: have other projects in the region contributed to a general opinion that green energy is a good thing? Then people are much keener to support new local projects, the authors suggest: “Positive experiences gained at individual sites can expand to a broader regional level or even influence national policies”.

Factors Influencing the Societal Acceptance of New Energy Technologies: Meta-analysis of Recent European Projects
E. Heiskanen et al., Create Acceptance, 2008

To compensate or not?

Compensation is a favourite government strategy for overcoming low acceptance of large energy infrastructure projects. Governments use it as a cure-all when confronted with resistance from local populations, but confidence in compensation is excessive and the costs associated with it are sometimes prohibitive. A better way would be to stimulate dialogue, this study concludes.

The Location of Regasification Plants: Is Compensation a Cure-all?
Matteo Bartolomeo, Politecnico di Milano, 2007

Risk-free green, please

The Bureau for Technology Assessment of the German Parliament (TAB) has been monitoring public attitudes to technology since 1997. Germans have a positive attitude towards technology and technological progress, but it is an ambivalent one: when questioned about the impact of technological progress, a significant number of respondents chose a negative or undecided option. While acceptance of green technologies remains high, acceptance of technologies perceived as dangerous, such as nuclear energy, has dropped dramatically.

Monitoring Technikakzeptanz und Kontroversen über Technik: Positive Veränderung des Meinungsklimas [Monitoring technology acceptance and technology controversies: a positive shift in the climate of opinion]
TAB 1997–2009


Rethinking Nimby

Selfish? Or the expression of a desire for a better environment and quality of life? This paper urges a rethink of the Nimby (not in my back yard) syndrome. “Generalized distrust [has] hidden deeper reasons from view”, the authors write. The Nimby syndrome could be a way of bringing hidden conflicts in society out into the open and could “help translate perceptions and intentions and build partnerships between various civil society members and between them and government bodies”.

The Nimby Syndrome and the Health of Communities
Senecal et al., Canadian Journal of Urban Research, 2006

Energy choices for Europe. Who decides?

Big-tech and Small-tech are two ‘essentially different’ development pathways for the European energy sector, according to this report commissioned by the European Parliament. We can opt for a scenario in which new gas- and coal-fired power plants are built with CCS technology to curb emissions. Or we can decentralise power generation to smaller wind farms, biomass facilities and other green technologies and concentrate on energy saving. In the first case we end up using more energy and concentrating the environmental burden in a few places; in the second we use less energy but have more turbines and other structures on our horizons. Or can we have both?

Future Energy Systems in Europe 
European Parliament Science and Technology Options Assessment, 2009

Text:
Philip Dröge and Pascal Messer

Photos:
© Masterfile, Agefotostock
