By RICHARD GRAY FOR MAILONLINE
PUBLISHED: 08:26 GMT, 16 March 2016 | UPDATED: 16:26 GMT, 16 March 2016
It is a country more famous for its violent Viking past and the melancholy brooding of Hamlet, but Denmark has emerged as the happiest nation in the world. The Scandinavian country, which has a population of around 5.6 million, knocked Switzerland into second place to take the title of the happiest country in the 2016 World Happiness Report.
A combination of a relatively high GDP, good healthy life expectancy and high levels of social support gave the country its edge over the other nations. The country is also well known for its generous state welfare and its egalitarian nature.
It was closely followed by Switzerland, Iceland, Norway and Finland as the next happiest places to live.
At the other end of the spectrum, Syria and Burundi were ranked as the least happy of the 157 countries assessed.
The report was released ahead of the UN's World Happiness Day this weekend. Jeffrey Sachs, from Columbia University, who helped write the report, said that happiness and well-being should be on every nation's agenda.
He said: 'Measuring self-reported happiness and achieving well-being should be on every nation's agenda as they begin to pursue the Sustainable Development Goals. 'Indeed the Goals themselves embody the very idea that human well-being should be nurtured through a holistic approach that combines economic, social and environmental objectives. 'Rather than taking a narrow approach focused solely on economic growth, we should promote societies that are prosperous, just, and environmentally sustainable.'
The report is the fourth of its kind to be produced, using a variety of factors to measure happiness around the world since the first World Happiness Report in 2012.
People's health, access to medical care, family relations, job security, political freedom and levels of government corruption are all among the measures used to assess happiness.
China was ranked in 83rd place and Russia came in 56th place.
Some countries that have experienced dramatic turmoil in recent years also saw significant drops in their rankings since the previous report in 2015.
Greece, for example, slipped from 120th to 126th place, while Spain and Italy, which were also hit hard by the Eurozone crisis, also fell.
Despite high taxes, Denmark was able to take the top spot because much of this revenue is reinvested in schools, universities and free access to healthcare. Students are given monthly grants for up to seven years, while 43 per cent of the top jobs in the public sector are held by women. Many feel confident that if they lose their jobs or fall ill, the state will support them.
The country also has to deal with few natural disasters and has little corruption. Knud Christensen, a 39-year-old social worker from Copenhagen, said: 'We have no worries. And if we do worry, it's about the weather. Will it rain today, or remain gray, or will it be cold?'
Kaare Christensen, a university professor in demography and epidemiology in Odense, where fairy tale writer Hans Christian Andersen was born, says it doesn't take much to satisfy Danes. 'They are happy with what they get. Danes have no great expectations about what they do or what happens to them,' he said.
Christian Bjoernskov, an economics professor at the University of Aarhus, Denmark's second-largest city, believes feelings of self-assurance and self-determination have a lot to do with it. 'Danes feel confident in one another... when we stand together we can succeed,' he says. 'And they also have a strong belief they can decide their own lives.'
The Roman Catholic Church has welcomed the study, declaring that happiness is 'linked to the common good, which makes it central to Catholic social teaching,' according to Bishop Marcelo Sanchez Sorondo, one of Pope Francis' key advisers at the Vatican.
Article source : http://www.dailymail.co.uk/sciencetech/article-3494634/Danes-spot-world-happiness-report.html
<Questions>
Q1. What is your definition of Happiness? What are the basic requirements for your happiness?
Q2. Recall a moment this past week when you felt happy. Please share your story with the other members.
Q3. Do you feel happiness in your current life? Please tell us some of the reasons that make you happy or unhappy.
Q4. People's health, access to medical care, family relations, job security, political freedom and levels of government corruption are all among the measures used to assess happiness.
What grade would you give South Korea on these criteria? Or judge your own nation!
Q5. Do you think money can make you happy? How much money is enough for you to be happy? What will you do with that amount of money?
Q6. According to the survey, Denmark was ranked as the happiest nation in the world. The Scandinavian country has a high GDP, good social welfare and better medical care compared with other nations.
So, if your government asked you to pay more tax to be reinvested in schools, universities and free healthcare for you and your kids, would you accept that proposal?
Q7. If you could select one country to live in, which country would you choose?
Jamie Oliver: Time to get 'medieval' on sugar - Newsnight
Budget 2016: Sugar tax on soft drinks To Tackle Childhood Obesity
Jamie Oliver is jumping for joy.
George Osborne today announced a sugar tax on soft drinks in a bid to tackle the rise in childhood obesity.
The Chancellor revealed the tax would kick in from 2018 and money raised from the levy would fund sports activities in primary schools. The tax could see the price of a can of fizzy drink rise by 8p.
Speaking from the Despatch Box as he delivered his Budget, Osborne said: "I am not prepared to look back at my time here in this Parliament, doing this job and say to my children's generation: 'I'm sorry. We knew there was a problem with sugary drinks. We knew it caused disease but we ducked the difficult decisions and we did nothing.'
"So today I can announce that we will introduce a new sugar levy on the soft drinks industry."
Labour leader Jeremy Corbyn welcomed the tax, saying it was needed to tackle the "grotesque" levels of sugar consumed by children. Chef Jamie Oliver, who has long campaigned on the issue, Tweeted his delight.
The tax itself will be levied on the drinks companies, who will be assessed on the volume of sugar-sweetened drinks they produce or import. There will be two bands - one for drinks with sugar above five grams per 100 millilitres, and a second for those with more than eight grams per 100 millilitres.
The Chancellor told the Commons that "five-year-old children are consuming their body weight in sugar every year" as he unveiled the policy. He added: "One of the biggest contributors to childhood obesity is sugary drinks. A can of cola typically has nine teaspoons of sugar in it - some popular drinks have as many as 13. That can be more than double a child's recommended added sugar intake." A report by Public Health England produced last year recommended a 10%-20% tax on products with high sugar content.
In January, NHS Chief Executive Simon Stevens unveiled his own plans to impose a levy on sugary drinks and snacks in vending machines on NHS property. The British Medical Association today welcomed the announcement, and its Science Board chairperson Baroness Sheila Hollins said: “The Chancellor’s decision to introduce a new levy on excessive sugar in soft drinks is a welcome step forward and a move called for in the BMA’s recent Food for Thought report.
"This is an important initiative that could help to begin to address the obesity crisis amongst young children, although the delay in introducing it for two years is disappointing." Jamie Oliver threatened to make his sugar tax campaign “more ninja” against the Conservatives last month during an appearance on The Andrew Marr Show. He said that when he first campaigned on the issue "I was a lone voice, everyone was trying to make me look like a fruit cake" but now organisations from the BMA to cancer charities were on his side.
"Obesity costs more globally than all conflict on the planet. It is a war, it doesn't have a shoot out, it just slowly makes people ill, die young. Anyone you would trust your kids with is in support of this.
"I don’t mind not getting the tax if there’s something better."
While Labour welcomed the policy, Ukip's Director of Policy Mark Reckless spoke out against the move.
Article source : http://www.huffingtonpost.co.uk/entry/budget-2016-sugar-tax-osborne_uk_56e96322e4b03fb88ede7b5f
Budget 2016: Sugar tax on soft drinks
In a shock move, George Osborne announces new taxes on sugary drinks, with funds raised
to go towards more sport in primary schools
By Laura Donnelly, Health Editor,
A sugar tax on soft drinks - adding around eight pence to a can of fizzy drink - is to be introduced amid Government pledges to save a generation from the toll of obesity. In a shock move which delighted health campaigners, but drew fury from industry, George Osborne said levies will be introduced in 2018, as the centre-piece of his budget.
The Chancellor said the move was necessary to “do the right thing for the next generation” - which on current trends will otherwise see half of boys and almost three quarters of girls overweight or obese. The new system would mean tax levels on sugary drinks are staggered, depending on sugar content. It means a standard can of Coca-Cola - costing around 70 pence – would have an 8 pence tax placed on it, while a can of Sprite would have an additional levy of 6 pence.
Although the plans are aimed at childhood obesity, there was some ridicule last night about the fact tonic water will be hit by the taxes. Meanwhile, fruit juices and milky drinks will be excluded, despite concerns that some high street chains are selling lattes and mocha drinks with more than 20 teaspoons of sugar a piece.
The plans come amid mounting concern about rising obesity levels, and lobbying from health campaigners for a sugar tax.
The average teenager now consumes three times the recommended daily sugar limit, with soft drinks the largest single source of added sugar. Until now, the Prime Minister had been thought to be reluctant to introduce such a tax, and the announcement yesterday sent share prices in the soft drinks industry plummeting.
The cash raised - an estimated £520 million a year - will be spent on doubling funding for sport in primary schools, the Chancellor said.
He suggested the changes were a chance to save a generation,
“We cannot have a long-term plan for the country unless we have a long-term plan for our children’s healthcare,” Mr Osborne told the Commons.
“Five-year-old children are consuming their body weight in sugar every year. Experts predict that within a generation more than half of all boys and 70 per cent of girls could be overweight or obese,” he added, warning that obesity is costing the economy £27 billion a year - more than half the NHS pay-bill.
"I am not prepared to look back at my time here in this Parliament, doing this job and say to my children's generation 'I'm sorry. We knew there was a problem with sugary drinks. We knew it caused disease. But we ducked the difficult decisions and we did nothing,’” he said. Official recommendations are to receive no more than 5 per cent of energy intake from sugar. Latest figures show children are having three times as much, while adults are having more than twice the daily recommended limit.
Soft drinks - excluding fruit juice - are now the largest single source of sugar for children aged 11 to 18 years, providing 29 per cent of daily sugar intake, official data shows. On average, those who consume them have one can a day.
The two tiers of taxes mean that a few soft drinks such as Tango and Lilt will be exempt, because their sugar content is below 5g per 100ml.
A number of drinks, including Fanta and Sprite, with between 5g and 8g of added sugar per 100ml, will be taxed at 18 pence per litre – equivalent to 6 pence for a 330ml can.
But the highest tax of 24 pence per litre, or 8 pence per can, will fall on dozens of the most popular drinks, including Pepsi, Coca-Cola and Ribena.
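For readers who want to see how the two bands translate into a per-can charge, here is a minimal sketch of the arithmetic described above. The thresholds and rates (exempt below 5g of sugar per 100ml, 18p per litre for 5-8g, 24p per litre above 8g) are taken from this article; the drink names and sugar figures in the demo are purely illustrative, not official data.

```python
# Two-band sugar levy sketch, based on the rates reported in this article.
# Sugar contents below are illustrative examples only.

def levy_per_litre(sugar_g_per_100ml: float) -> float:
    """Return the levy in pence per litre for a given sugar content."""
    if sugar_g_per_100ml > 8:
        return 24.0   # higher band
    if sugar_g_per_100ml > 5:
        return 18.0   # lower band
    return 0.0        # exempt

def levy_per_can(sugar_g_per_100ml: float, can_ml: int = 330) -> float:
    """Levy in pence for a single can of the given size."""
    return levy_per_litre(sugar_g_per_100ml) * can_ml / 1000

if __name__ == "__main__":
    for name, sugar in [("cola-style drink (10.6g/100ml)", 10.6),
                        ("lemonade-style drink (6.6g/100ml)", 6.6),
                        ("diet drink (0g/100ml)", 0.0)]:
        pence = levy_per_can(sugar)
        print(f"{name}: {pence:.2f}p (~{round(pence)}p) per 330ml can")
```

Running it reproduces the article's figures of roughly 8p per can in the higher band and 6p per can in the lower band.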
Chris Askew, Diabetes UK Chief Executive, said: “It is really promising news that the Government has announced a tax on the soft drinks industry. We have been campaigning for this measure as we are all consuming too much sugar.
"This is contributing to the huge rise we are seeing in the numbers of people who are overweight and obese, and therefore at increased risk of Type 2 diabetes. There are already around 3.6 million people in the UK with Type 2 diabetes. This is already a huge health and economic burden for individuals and health systems."
Celebrity chef Jamie Oliver, who led a campaign for such levies to be introduced, immediately welcomed the announcement, describing it as a “profound move that will ripple around the world.” He said it was “bold, brave and logical”.
Simon Stevens, chief executive of NHS England, said: "This bold and welcome action will send a powerful signal and incentivise soft drinks companies to act on the health consequences of their products.
"Of the range of measures needed to prevent child obesity, this was one of the most sensitive, so to have it under our belts is highly welcome,” he added.
He said parents who banned sugary drinks could help their children to a healthier future.
"While no child needs a daily dose of sugary fizzy water, sadly soft drinks are now our children's largest single source of diabetes-inducing teeth-rotting excess sugar.
But shares in listed drinks firms dropped sharply on the London market after the sugar tax announcement. Irn Bru maker AG Barr, which also makes Tizer and St Clement's, fell 5 per cent, while Robinsons squash firm Britvic fell 3 per cent and Vimto maker Nichols plunged as much as 11 per cent.
And Jonathan Isaby, chief executive of the TaxPayers' Alliance, said: "It is ludicrous that the Chancellor decided to cave in to the demands of the High Priest of the Nanny State in the public health lobby and introduce a hugely regressive and entirely ineffective sugar tax. This will hit the poorest families hardest and all the evidence shows it simply won't work."
<< Q&A >>
Q1. What is the sugar tax trying to achieve?
Childhood obesity has risen sharply in recent decades, with one in four children overweight by the age of five.
Among those aged between 11 and 15, 37 per cent are either overweight or obese – a rise from 28 per cent in just two decades.
Q2. Why are we getting fatter?
Campaigners say the environment has become more “obesogenic” – with a supersize culture, and increasing reliance on snacks and convenience foods. At the heart of this lies sugar. The average child now consumes three times as much sugar as the recommended limit, health officials say.
Q3. Where are all the calories coming from?
Soft drinks - excluding fruit juice - are now the largest single source of sugar for children aged 11 to 18 years, providing 29 per cent of daily sugar intake, official data shows.
Q4. Why will taxes change our habits?
Campaigners believe that a sugar tax on sugary drinks is a simple way to persuade the industry to change, or to persuade consumers – especially parents – to simply cut out high sugar drinks. By giving manufacturers notice of the levies – due to be introduced in 2018 – the Government says they’ve left the industry time to change the formulation of their wares, and reduce sugar content of some of their big sellers.
Ministers hope the move will mean that drinks companies cut down on the amount of sugar in drinks, or their portion sizes. If prices do go up, they hope that will nudge consumers towards lower sugar alternatives. The Office for Budget Responsibility (OBR) has estimated it could add 18-24p to the price of a litre of fizzy drink if the full cost is passed on to the consumer. But a Treasury spokesman said: "We are hoping that producers do not pass on the costs to consumers. They can obviously reformulate their products, which many of them have done."
Q5. How has the reaction been?
So far, health campaigners including celebrity chef Jamie Oliver and the head of the Health Service have been thrilled.
But the drinks industry has reacted with anger. The Food and Drink Federation dismissed the move as "a piece of political theatre" that would cost jobs, and share prices in major manufacturers dropped sharply.
Q6. What happens next?
The changes are due to be introduced in April 2018. And later this year the Government is expected to publish its long-awaited Childhood Obesity Strategy. This could mean a clampdown on TV advertising of junk foods to children during family viewing, and restrictions on 2 for 1 promotions on such fare. But the changes announced yesterday are some of the most controversial and sweeping any Government in this country has attempted in decades.
Taxes around the world
Mexico introduced a 10 per cent tax on sugar-sweetened beverages in 2014 in a bid to reverse trends which have seen it overtake the United States to become the most obese country in the world.
Research published earlier this year found it cut sales by 12 per cent in its first year of introduction.
They found that over the course of the year, the average person purchased 4.2 fewer litres of sugar-sweetened drinks. Meanwhile, there was a four per cent rise in sales of untaxed drinks – mainly due to more purchases of bottled plain water.
The Treasury said it has looked at the way sugar taxes have been used around the world and opted for a model similar to Hungary, where the introduction of a tax on companies has led to a 40 per cent reduction in levels of sugar in products.
But other attempts at restrictions have faltered, and become mired in controversy.
In 2012 the French government courted controversy by introducing a sugar tax of just 1 penny per can of drink.
Coca-Cola suspended a major investment in a “symbolic protest against a tax that punishes our company and stigmatises our products”.
An attempt, first proposed by then Mayor Michael Bloomberg, to ban supersize drinks in New York was later struck down by the state’s highest court in 2014.
And a fat tax in Denmark was repealed after just 15 months.
The levy, on all food products containing more than 2.3 per cent saturated fat, saw shoppers turning to budget brands of the same products, or even crossing borders for their weekly shop.
Article source : http://www.telegraph.co.uk/news/health/news/12195786/Budget-2016-Sugar-tax-on-soft-drinks.html
<Questions>
Q1. Do you know Chef Jamie Oliver, who has long campaigned on the issue? Have you ever watched his cooking program?
Q2. What do you think about Chef Jamie Oliver's campaign to push the sugar tax agenda in Parliament even when nobody was listening? If you found a crucial and meaningful issue, would you make the same effort to fix it as he did?
Q3. What do you think about George Osborne's sugar tax bill to reduce childhood obesity?
Q4. What do you think about the concept of a fat tax, as enacted in some European countries?
Q5. Would you allow your kid to drink beverages with lots of sugar in them?
Q6. Are Your Eating Habits A Priority Or A Preference?
Q7. Do you live a life full of preferences or do you live a life full of Priorities?
Q8. Sometimes we crave sweet-flavored food. Why do we crave sugar? Please describe specific situations.
Can We Create an Ethical Robot?
Without our social sense, an android will buy that last muffin,
and a driverless car might run over a child
By JERRY KAPLAN/ July 24, 2015 1:21 p.m. ET
As you try to imagine yourself cruising along in the self-driving car of the future, you may think first of the technical challenges: how an automated vehicle could deal with construction, bad weather or a deer in the headlights. But the more difficult challenges may have to do with ethics. Should your car swerve to save the life of the child who just chased his ball into the street at the risk of killing the elderly couple driving the other way? Should this calculus be different when it’s your own life that’s at risk or the lives of your loved ones?
Recent advances in artificial intelligence are enabling the creation of systems capable of independently pursuing goals in complex, real-world settings—often among and around people. Self-driving cars are merely the vanguard of an approaching fleet of equally autonomous devices. As these systems increasingly invade human domains, the need to control what they are permitted to do, and on whose behalf, will become more acute.
How will you feel the first time a driverless car zips ahead of you to take the parking spot you have been patiently waiting for? Or when a robot buys the last dozen muffins at Starbucks while a crowd of hungry patrons looks on? Should your mechanical valet be allowed to stand in line for you, or vote for you?
In the suburb where I live, downtown parking is limited to two hours during the day. The purpose of this rule is to broadly allocate a scarce resource and to promote the customer turnover critical to local businesses. Now imagine that I’m the proud owner of a fancy new autonomous car, capable of finding a spot and parking by itself. You might think that my car should be permitted to do anything that is legal for me to do—but in this case, should I be allowed to instruct it to repark itself every two hours?
Delegating my authority to the car undermines the intent of the law, precisely because it circumvents the cost intentionally imposed on me for the community’s greater good. We can certainly modify the rule to accommodate this new invention, but it is hard to see any general principles that we can apply across the board. We will need to examine each of our rules and adjust them on a case-by-case basis.
Then there is the problem of redesigning our public spaces. Within the next few decades, our stores, streets and sidewalks will likely be crammed with robotic devices fetching and delivering goods of every variety. How do we ensure that they respect the unstated conventions that people unconsciously follow when navigating in crowds?
A debate may erupt over whether we should share our turf with machines or banish them to separate facilities. Will it be “Integrate Our Androids!” or “Ban the Bots!”
And far more serious issues are on the horizon. Should it be permissible for an autonomous military robot to select its own targets? The current consensus in the international community is that such weapons should be under “meaningful human control” at all times, but even this seemingly sensible constraint is ethically muddled. The expanded use of such robots may reduce military and civilian casualties and avoid collateral damage. So how many people’s lives should be put at risk waiting for a human to review a robot’s time-critical kill decision?
Even if we can codify our principles and beliefs algorithmically, that won’t solve the problem. Simply programming intelligent systems to obey rules isn’t sufficient, because sometimes the right thing to do is to break those rules. Blindly obeying a posted speed limit of 55 miles an hour may be quite dangerous, for instance, if traffic is averaging 75, and you wouldn’t want your self-driving car to strike a pedestrian rather than cross a double-yellow centerline.
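As a purely illustrative aside (not part of Kaplan's essay), the gap between a codified rule and the contextually right action can be sketched in a few lines of code; the functions, the 10 mph tolerance and the traffic figure are all invented assumptions, not a proposed driving policy.

```python
# Illustrative only: a hard-coded rule versus a contextual override, showing
# why "obey the rules" alone can be the wrong policy for an autonomous system.

SPEED_LIMIT_MPH = 55

def naive_target_speed(limit_mph: float) -> float:
    # Rule-following only: always drive at the posted limit.
    return limit_mph

def contextual_target_speed(limit_mph: float, traffic_avg_mph: float) -> float:
    # Sometimes the safer choice deviates from the literal rule: roughly match
    # surrounding traffic rather than create a large speed differential.
    # The 10 mph tolerance is an arbitrary illustrative parameter.
    return min(traffic_avg_mph, limit_mph + 10)

print(naive_target_speed(SPEED_LIMIT_MPH))            # 55, regardless of context
print(contextual_target_speed(SPEED_LIMIT_MPH, 75))   # 65, tracking traffic flow
```

The point of the sketch is Kaplan's: any fixed encoding of the rule leaves open the question of when, and by how much, the system should be allowed to break it.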
People naturally abide by social conventions that may be difficult for machines to perceive, much less follow. Finding the right balance between our personal interests and the needs of others—or society in general—is a finely calibrated human instinct, driven by a sense of fairness, reciprocity and common interest. Today’s engineers, racing to bring these remarkable devices to market, are ill-prepared to design social intelligence into a machine. Their real challenge is to create civilized robots for a human world.
—This essay is adapted from Mr. Kaplan’s new book, “Humans Need Not Apply: A Guide to Wealth and Work in the Age of Artificial Intelligence,” which will be published August 4 by Yale University Press
Article source: http://www.wsj.com/articles/can-we-create-an-ethical-robot-1437758519
How to build an ethical robot
Wednesday 16 March 2016
Many people assume that robots would have to be sentient before they could act ethically. But this is not the case, says Alan Winfield, Director of the Science Communication Unit at the University of the West of England.
“The robot behaves ethically not because it chooses to but because it’s programmed to do so,” he says. “We call it an ethical zombie.”
In this video for the World Economic Forum's IdeasLab series, Winfield poses the question: “If we can build even minimally ethical robots, are we morally compelled to do so?”
And with driverless cars just around the corner, it’s a question that we’re going to have to answer quite soon.
Article source : http://www.weforum.org/agenda/2016/03/how-to-build-an-ethical-robot
Robotics: Ethics of artificial intelligence
27 May 2015
Four leading researchers share their concerns and solutions
for reducing societal risks from intelligent machines.
Stuart Russell: Take a stand on AI weapons
Sabine Hauert: Shape the debate, don't shy from it
Russ Altman: Distribute AI benefits fairly
Manuela Veloso: Embrace a robot–human world
Stuart Russell: Take a stand on AI weapons
Professor of computer science, University of California, Berkeley
The artificial intelligence (AI) and robotics communities face an important ethical decision: whether to support or oppose the development of lethal autonomous weapons systems (LAWS).
Technologies have reached a point at which the deployment of such systems is — practically if not legally — feasible within years, not decades. The stakes are high: LAWS have been described as the third revolution in warfare, after gunpowder and nuclear arms.
Autonomous weapons systems select and engage targets without human intervention; they become lethal when those targets include humans. LAWS might include, for example, armed quadcopters that can search for and eliminate enemy combatants in a city, but do not include cruise missiles or remotely piloted drones for which humans make all targeting decisions.
Existing AI and robotics components can provide physical platforms, perception, motor control, navigation, mapping, tactical decision-making and long-term planning. They just need to be combined. For example, the technology already demonstrated for self-driving cars, together with the human-like tactical control learned by DeepMind's DQN system, could support urban search-and-destroy missions.
Two US Defense Advanced Research Projects Agency (DARPA) programmes foreshadow planned uses of LAWS: Fast Lightweight Autonomy (FLA) and Collaborative Operations in Denied Environment (CODE). The FLA project will program tiny rotorcraft to manoeuvre unaided at high speed in urban areas and inside buildings. CODE aims to develop teams of autonomous aerial vehicles carrying out “all steps of a strike mission — find, fix, track, target, engage, assess” in situations in which enemy signal-jamming makes communication with a human commander impossible. Other countries may be pursuing clandestine programmes with similar goals.
International humanitarian law — which governs attacks on humans in times of war — has no specific provisions for such autonomy, but may still be applicable. The 1949 Geneva Convention on humane conduct in war requires any attack to satisfy three criteria: military necessity; discrimination between combatants and non-combatants; and proportionality between the value of the military objective and the potential for collateral damage. (Also relevant is the Martens Clause, added in 1977, which bans weapons that violate the “principles of humanity and the dictates of public conscience.”) These are subjective judgments that are difficult or impossible for current AI systems to satisfy.
The United Nations has held a series of meetings on LAWS under the auspices of the Convention on Certain Conventional Weapons (CCW) in Geneva, Switzerland. Within a few years, the process could result in an international treaty limiting or banning autonomous weapons, as happened with blinding laser weapons in 1995; or it could leave in place the status quo, leading inevitably to an arms race.
As an AI specialist, I was asked to provide expert testimony for the third major meeting under the CCW, held in April, and heard the statements made by nations and non-governmental organizations. Several countries pressed for an immediate ban. Germany said that it “will not accept that the decision over life and death is taken solely by an autonomous system”; Japan stated that it “has no plan to develop robots with humans out of the loop, which may be capable of committing murder” (see go.nature.com/fwric1).
The United States, the United Kingdom and Israel — the three countries leading the development of LAWS technology — suggested that a treaty is unnecessary because they already have internal weapons review processes that ensure compliance with international law.
Almost all states that are party to the CCW agree on the need for 'meaningful human control' over the targeting and engagement decisions made by robotic weapons. Unfortunately, the meaning of 'meaningful' is still to be determined.
The debate has many facets. Some argue that the superior effectiveness and selectivity of autonomous weapons can minimize civilian casualties by targeting only combatants. Others insist that LAWS will lower the threshold for going to war by making it possible to attack an enemy while incurring no immediate risk; or that they will enable terrorists and non-state-aligned combatants to inflict catastrophic damage on civilian populations.
LAWS could violate fundamental principles of human dignity by allowing machines to choose whom to kill — for example, they might be tasked to eliminate anyone exhibiting 'threatening behaviour'. The potential for LAWS technologies to bleed over into peacetime policing functions is evident to human-rights organizations and drone manufacturers.
In my view, the overriding concern should be the probable endpoint of this technological trajectory. The capabilities of autonomous weapons will be limited more by the laws of physics — for example, by constraints on range, speed and payload — than by any deficiencies in the AI systems that control them. For instance, as flying robots become smaller, their manoeuvrability increases and their ability to be targeted decreases. They have a shorter range, yet they must be large enough to carry a lethal payload — perhaps a one-gram shaped charge to puncture the human cranium. Despite the limits imposed by physics, one can expect platforms deployed in the millions, the agility and lethality of which will leave humans utterly defenceless. This is not a desirable future.
The AI and robotics science communities, represented by their professional societies, are obliged to take a position, just as physicists have done on the use of nuclear weapons, chemists on the use of chemical agents and biologists on the use of disease agents in warfare. Debates should be organized at scientific meetings; arguments studied by ethics committees; position papers written for society publications; and votes taken by society members. Doing nothing is a vote in favour of continued development and deployment.
Sabine Hauert: Shape the debate, don't shy from it
Lecturer in robotics, University of Bristol
Irked by hyped headlines that foster fear or overinflate expectations of robotics and artificial intelligence (AI), some researchers have stopped communicating with the media or the public altogether.
But we must not disengage. The public includes taxpayers, policy-makers, investors and those who could benefit from the technology. They hear a mostly one-sided discussion that leaves them worried that robots will take their jobs, fearful that AI poses an existential threat, and wondering whether laws should be passed to keep hypothetical technology 'under control'. My colleagues and I spend dinner parties explaining that we are not evil but instead have been working for years to develop systems that could help the elderly, improve health care, make jobs safer and more efficient, and allow us to explore space or beneath the oceans.
Experts need to become the messengers. Through social media, researchers have a public platform that they should use to drive a balanced discussion. We can talk about the latest developments and limitations, provide the big picture and demystify the technology. I have used social media to crowd-source designs for swarming nanobots to treat cancer. And I found my first PhD student through his nanomedicine blog.
The AI and robotics community needs thought leaders who can engage with prominent commentators such as physicist Stephen Hawking and entrepreneur–inventor Elon Musk and set the agenda at international meetings such as the World Economic Forum in Davos, Switzerland. Public engagement also drives funding. Crowdfunding for JIBO, a personal robot for the home developed by Cynthia Breazeal at the Massachusetts Institute of Technology (MIT) in Cambridge, raised more than US$2.2 million.
There are hurdles. First, many researchers have never tweeted, blogged or made a YouTube video. Second, outreach is 'yet another thing to do', and time is limited. Third, it can take years to build a social-media following that makes the effort worthwhile. And fourth, engagement work is rarely valued in research assessments, or regarded seriously by tenure committees.
Training, support and incentives are needed. All three are provided by Robohub.org, of which I am co-founder and president. Launched in 2012, Robohub is dedicated to connecting the robotics community to the public. We provide crash courses in science communication at major AI and robotics conferences on how to use social media efficiently and effectively. We invite professional science communicators and journalists to help researchers to prepare an article about their work. The communicators explain how to shape messages to make them clear and concise and avoid pitfalls, but we make sure the researcher drives the story and controls the end result. We also bring video cameras and ask researchers who are presenting at conferences to pitch their work to the public in five minutes. The results are uploaded to YouTube. We have built a portal for disseminating blogs and tweets, amplifying their reach to tens of thousands of followers.
“Through social media, researchers have a public platform that they should use to drive a balanced discussion.”
I can list all the benefits of science communication, but the incentive must come from funding agencies and institutes. Citations cannot be the only measure of success for grants and academic progression; we must also value shares, views, comments or likes. MIT robotics researcher Rodney Brooks's classic 1986 paper on the 'subsumption architecture', a bio-inspired way to program robots to react to their environment, gathered nearly 10,000 citations in 30 years (R. Brooks IEEE J. Robot. Automat. 2, 14–23; 1986). A video of Sawyer, a robot developed by Brooks's company Rethink Robotics, received more than 60,000 views in one month (see go.nature.com/jqwfmz). Which has had more impact on today's public discourse?
Governments, research institutes, business-development agencies, and research and industry associations do welcome and fund outreach and science-communication efforts. But each project develops its own strategy, resulting in pockets of communication that have little reach.
In my view, AI and robotics stakeholders worldwide should pool a small portion of their budgets (say 0.1%) to bring together these disjointed communications and enable the field to speak more loudly. Special-interest groups, such as the Small Unmanned Aerial Vehicles Coalition that is promoting a US market for commercial drones, are pushing the interests of major corporations to regulators. There are few concerted efforts to promote robotics and AI research in the public sphere. This balance is badly needed.
A common communications strategy will empower a new generation of roboticists that is deeply connected to the public and able to hold its own in discussions. This is essential if we are to counter media hype and prevent misconceptions from driving perception, policy and funding decisions.
Russ Altman: Distribute AI benefits fairly
Professor of bioengineering, genetics, medicine and computer science, Stanford University
Artificial intelligence (AI) has astounding potential to accelerate scientific discovery in biology and medicine, and to transform health care. AI systems promise to help make sense of several new types of data: measurements from the 'omics' such as genomics, proteomics and metabolomics; electronic health records; and digital-sensor monitoring of health signs.
Clustering analyses can define new syndromes — separating diseases that were thought to be the same and unifying others that have the same underlying defects. Pattern-recognition technologies may match disease states to optimal treatments. For example, my colleagues and I are identifying groups of patients who are likely to respond to drugs that regulate the immune system on the basis of clinical and transcriptomic features.
In consultations, physicians might be able to display data from a 'virtual cohort' of patients who are similar to the one sitting next to them and use it to weigh up diagnoses, treatment options and the statistics of outcomes. They could make medical decisions interactively with such a system or use simulations to predict outcomes on the basis of the patient's data and that of the virtual cohort.
“AI technologies could exacerbate existing health-care disparities and create new ones.”
I have two concerns. First, AI technologies could exacerbate existing health-care disparities and create new ones unless they are implemented in a way that allows all patients to benefit. In the United States, for example, people without jobs experience diverse levels of care. A two-tiered system in which only special groups or those who can pay — and not the poor — receive the benefits of advanced decision-making systems would be unjust and unfair. It is the joint responsibility of the government and those who develop the technology and support the research to ensure that AI technologies are distributed equally.
Second, I worry about clinicians' ability to understand and explain the output of high-performance AI systems. Most health-care providers will not accept a complex treatment recommendation from a decision-support system without a clear description of how and why it was reached.
Unfortunately, the better the AI system, the harder it often is to explain. The features that contribute to probability-based assessments such as Bayesian analyses are straightforward to present; deep-learning networks, less so.
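To make the contrast concrete, here is a minimal sketch of why a probability-based assessment is straightforward to present: in a naive Bayes-style model, each feature contributes a single additive log-odds term that can be listed for a clinician, whereas a deep network offers no comparably simple decomposition. The feature names, prior and probabilities below are invented for illustration and are not from the article.

```python
# Illustrative naive Bayes-style explanation: each observed feature adds one
# log-odds term, so the "evidence" behind the output can be itemized.
import math

# P(feature present | disease) vs P(feature present | no disease) - made-up numbers
likelihoods = {
    "elevated_marker": (0.80, 0.10),
    "family_history":  (0.40, 0.20),
    "smoker":          (0.50, 0.30),
}
prior_odds = 0.05 / 0.95  # assumed prior probability of disease = 5%

log_odds = math.log(prior_odds)
print(f"prior log-odds: {log_odds:+.2f}")
for name, (p_disease, p_healthy) in likelihoods.items():
    contribution = math.log(p_disease / p_healthy)  # this feature's evidence
    log_odds += contribution
    print(f"{name:>16}: {contribution:+.2f}")

posterior = 1 / (1 + math.exp(-log_odds))  # convert log-odds back to probability
print(f"posterior probability: {posterior:.2f}")
```

Each printed line is a human-readable reason; the difficulty Altman describes is that no such per-feature ledger falls naturally out of a deep-learning network.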
AI researchers who create the infrastructure and technical capabilities for these systems need to engage doctors, nurses, patients and others to understand how they will be used, and used fairly.
Manuela Veloso: Embrace a robot–human world
Professor of computer science, Carnegie Mellon University
Humans seamlessly integrate perception, cognition and action. We use our sensors to assess the state of the world, our brains to think and choose actions to achieve objectives, and our bodies to execute those actions. My research team is trying to build robots that are capable of doing the same — with artificial sensors (cameras, microphones and scanners), algorithms and actuators, which control the mechanisms.
But autonomous robots and humans differ greatly in their abilities. Robots may always have perceptual, cognitive and actuation limitations. They might not be able to fully perceive a scene, recognize or manipulate any object, understand all spoken or written language, or navigate in any terrain. I think that robots will complement humans, not supplant them. But robots need to know when to ask for help and how to express their inner workings.
To learn more about how robots and humans work together, for the past three years we have shared our laboratory and buildings with four collaborative robots, or CoBots, which we developed. The robots look a bit like mechanical lecterns. They have omnidirectional wheels that enable them to steer smoothly around obstacles; camera and lidar systems to provide depth vision; computers for processing; screens for communication; and a basket to carry things in.
Early on, we realized how challenging real environments are for robots. The CoBots cannot recognize every object they encounter; lacking arms or hands they struggle to open doors, pick things up or manipulate them. Although they can use speech to communicate, they may not recognize or understand the meaning of words spoken in response.
We introduced the concept of 'symbiotic autonomy' to enable robots to ask for help from humans or from the Internet. Now, robots and humans in our building help one another to overcome each other's limitations.
CoBots escort visitors through the building or carry objects between locations, gathering useful information along the way. For example, they can generate accurate maps of spaces, showing temperature, humidity, noise and light levels, or WiFi signal strength. We help the robots to open doors, press lift buttons, pick up objects and follow dialogue by giving clarifications.
There are still hurdles to overcome to enable robots and humans to co-exist safely and productively. My team is researching how people and robots can communicate more easily through language and gestures, and how robots and people can better match their representations of objects, tasks and goals.
We are also studying how robot appearance enhances interactions, in particular how indicator lights may reveal more of a robot's inner state to humans. For instance, if the robot is busy, its lights may be yellow, but when it is available they are green.
Although we have a way to go, I believe that the future will be a positive one if humans and robots can help and complement each other.
Article Source : http://www.nature.com/news/robotics-ethics-of-artificial-intelligence-1.17611
<Questions>
Q1. Did you watch the historic match between Lee Se-dol and AlphaGo? How did you feel about it?
Q2. Do you think an android can be ethical?
Q3. Why do we need robots? Why do we need AI technology?
Q4. Do you think AI can surpass the ability of human being?
Q5. Did you watch the movie Bicentennial Man? As depicted in that film, can we build robots with human-like qualities?
Q6. What are the merits and demerits of artificial intelligence?