Forecasting for Fall Uncertainties 

Here is how supply chain executives can prepare for the onset of fall and winter as pandemic forces continue to impact the business. (Credit: Getty Images) 

By Scott Lundstrom, Analyst, Supply Chain Futures 

Over the last several months, the supply chain planning community has been faced with the question of how to deal with increased uncertainty as we enter the fall. While we are adjusting to COVID-19, we are not overcoming it. Pandemic forces will continue to impact our business as we enter the fall and move into winter. Widespread vaccine availability is still 9 to 12 months away for most people. Environmental and climate disruption challenges continue unabated. Political instability and challenges still dominate the front page.  

Scott Lundstrom, Analyst, Supply Chain Futures

Our relatively stable world of global supply chains has been upended in ways we could never have imagined. What is a supply chain executive to do? While it might sound obvious at this point, COVID has impressed upon us all the need for digital transformation to drive resiliency and agility into our operations. First and foremost, we need to adopt an outside-in view of the supply chain. Viewing the supply chain as a demand-driven business network is essential to avoid execution failures, excess inventories, and the inevitable bullwhip effects of a chaotic business environment. AI and advanced supply chain analytics can help, but only if we have the data and processes required to make use of that intelligence in creating agility and resiliency. 

Changes in philosophy and strategy – from efficiency to resiliency. This really has little to do with technology. Change management among senior leaders can be incredibly challenging but is an absolute necessity. Adopting a focus on outside-in thinking and customer experience can be difficult after many years of internal process optimization to reduce costs and minimize inventory. Analytics can play a role in gaining a better understanding of where we are experiencing difficulties and disappointing customers. 

Changes in sourcing agreements to improve supply stability and demand forecasting – Supply chain is a team sport. It is only by working with our partner suppliers that we can improve resiliency. Moves toward more flexible agreements that allow a range of order actions across multiple categories based on demand and availability will help make supply chains less brittle and restrictive. Partner data about tier 2 and tier 3 suppliers can help us improve our planning models to incorporate uncertainty across geopolitical, climate, logistics, and pandemic dimensions. Utilizing better, more detailed data about suppliers may be one of the most important changes we can make in improving the resilience of our planning optimization models. This data is also essential if we hope to use machine learning and AutoML in our planning models.  

Changes in logistics planning embracing flexibility and local supply – One of the biggest changes we will see in supply chains this fall is a desire to move toward more local sources of supply. Geographical complexities driven by lockdowns, limited global shipping capacity, and geopolitical instability are causing the pendulum to swing back toward more local sources of supply. 

Changes in supply and demand data requirements and digital twins – Real improvements in supply chain performance require more real-time data. Real-time data from customers, suppliers, distributors, and logistics providers needs to be integrated to provide a real-time view of the end-to-end process of meeting customer needs. Increasingly, supply chain software providers are turning to digital twin and digital thread data models to help provide this visibility. Advanced analytics and machine learning algorithms are ideally suited to identify and resolve issues when provided with this type of operating framework. Preparing for uncertainty and creating resilience should be a focus of every supply chain organization as we move into the next wave of pandemic uncertainty. Prepared organizations will experience much higher levels of customer satisfaction and better business outcomes and performance. 
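To make the idea of planning for uncertainty concrete, here is a minimal, hypothetical sketch of demand forecasting that produces a low/expected/high range rather than a single point estimate. It is not drawn from the article or the author's own models; the feature names and data are invented for illustration, and it simply uses scikit-learn's quantile loss.

```python
# Hypothetical sketch: forecast a demand range instead of a single number,
# so the plan can carry explicit uncertainty. All features and data are toy.
import numpy as np
from sklearn.ensemble import GradientBoostingRegressor

rng = np.random.default_rng(0)
# Toy features: [week_of_year, promo_flag, supplier_lead_time_days]
X = np.column_stack([
    rng.integers(1, 53, 500),
    rng.integers(0, 2, 500),
    rng.normal(14, 4, 500),
])
demand = 100 + 25 * X[:, 1] + 0.5 * X[:, 0] + rng.normal(0, 10, 500)

forecast = {}
for name, q in [("low", 0.1), ("expected", 0.5), ("high", 0.9)]:
    model = GradientBoostingRegressor(loss="quantile", alpha=q)
    forecast[name] = float(model.fit(X, demand).predict(X[:1])[0])

print(forecast)  # e.g. {'low': ..., 'expected': ..., 'high': ...}
```

Planning inventory and sourcing against the low and high quantiles, rather than a single point forecast, is one simple way uncertainty can be carried through a planning model.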

Scott Lundstrom is an analyst focused on the intersection of AI, IoT and Supply Chains. See his blog at Supply Chain Futures. 

Source: https://www.aitrends.com/ai-in-industry/forecasting-for-fall-uncertainties/

Flexible expressions could lift 3D-generated faces out of the uncanny valley

3D-rendered faces are a big part of any major movie or game now, but the task of capturing and animating them in a natural way can be a tough one. Disney Research is working on ways to smooth out this process, among them a machine learning tool that makes it much easier to generate and manipulate 3D faces without dipping into the uncanny valley.

Of course this technology has come a long way from the wooden expressions and limited details of earlier days. High-resolution, convincing 3D faces can be animated quickly and well, but the subtleties of human expression are not just limitless in variety, they’re very easy to get wrong.

Think of how someone’s entire face changes when they smile — it’s different for everyone, but there are enough similarities that we fancy we can tell when someone is “really” smiling or just faking it. How can you achieve that level of detail in an artificial face?

Existing “linear” models simplify the subtlety of expression, making “happiness” or “anger” minutely adjustable, but at the cost of accuracy — they can’t express every possible face, but can easily result in impossible faces. Newer neural models learn complexity from watching the interconnectedness of expressions, but like other such models their workings are obscure and difficult to control, and perhaps not generalizable beyond the faces they learned from. They don’t enable the level of control an artist working on a movie or game needs, or result in faces that (humans are remarkably good at detecting this) are just off somehow.

A team at Disney Research proposes a new model with the best of both worlds — what it calls a “semantic deep face model.” Without getting into the exact technical execution, the basic improvement is that it’s a neural model that learns how a facial expression affects the whole face, but is not specific to a single face — and moreover is nonlinear, allowing flexibility in how expressions interact with a face’s geometry and each other.

Think of it this way: A linear model lets you take an expression (a smile, or kiss, say) from 0-100 on any 3D face, but the results may be unrealistic. A neural model lets you take a learned expression from 0-100 realistically, but only on the face it learned it from. This model can take an expression from 0-100 smoothly on any 3D face. That’s something of an over-simplification, but you get the idea.
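As a rough illustration of that contrast, the toy sketch below sets a classic linear blendshape rig next to a small nonlinear decoder that conditions on separate identity and expression codes. It is only a guess at the general shape of such a model, not Disney Research's actual architecture; the array sizes, the code dimensions, and the tiny MLP are all invented.

```python
# Toy sketch (not Disney's model): linear blendshapes vs. a nonlinear decoder
# that takes both an identity code and an expression code.
import numpy as np

N_VERTS = 1000                       # vertices in the toy face mesh
neutral = np.zeros((N_VERTS, 3))     # neutral face geometry
blendshapes = np.random.randn(5, N_VERTS, 3) * 0.01  # 5 expression offsets

def linear_face(weights):
    """Classic linear model: neutral + weighted sum of expression offsets.
    Any weight combination is allowed, including implausible ones."""
    return neutral + np.tensordot(weights, blendshapes, axes=1)

def nonlinear_face(identity_code, expression_code, params):
    """Stand-in for a 'semantic' deep model: a small MLP maps an identity
    code plus an expression code to vertex offsets, so the same expression
    code can be reused across different identities."""
    x = np.concatenate([identity_code, expression_code])
    h = np.tanh(params["W1"] @ x + params["b1"])
    offsets = (params["W2"] @ h + params["b2"]).reshape(N_VERTS, 3)
    return neutral + offsets

params = {
    "W1": np.random.randn(64, 16) * 0.1, "b1": np.zeros(64),
    "W2": np.random.randn(N_VERTS * 3, 64) * 0.01, "b2": np.zeros(N_VERTS * 3),
}
smile = np.array([1.0, 0.0, 0.0, 0.0, 0.0])       # "smile" dialed to 100%
print(linear_face(smile).shape)                    # (1000, 3)
print(nonlinear_face(np.random.randn(8), np.random.randn(8), params).shape)
```

The point of the second function is that the same expression code can be applied to any identity code, which is roughly the reusability the paragraph above describes.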

Image Credits: Disney Research

The results are powerful: You could generate a thousand faces with different shapes and tones, and then animate all of them with the same expressions without any extra work. Think how that could result in diverse CG crowds you can summon with a couple clicks, or characters in games that have realistic facial expressions regardless of whether they were hand-crafted or not.

It’s not a silver bullet, and it’s only part of a huge set of improvements artists and engineers are making in the various industries where this technology is employed — markerless face tracking, better skin deformation, realistic eye movements and dozens more areas of interest are also important parts of this process.

The Disney Research paper was presented at the International Conference on 3D Vision; you can read the full thing here.

Source: https://techcrunch.com/2020/11/25/flexible-expressions-could-lift-3d-generated-faces-out-of-the-uncanny-valley/

Europe sets out the rules of the road for its data reuse plan

European Union lawmakers have laid out a major legislative proposal today to encourage the reuse of industrial data across the Single Market by creating a standardized framework of trusted tools and techniques to ensure what they describe as “secure and privacy-compliant conditions” for sharing data.

Enabling a network of trusted and neutral data intermediaries, and an oversight regime comprised of national monitoring authorities and a pan-EU coordinating body, are core components of the plan.

The move follows the European Commission’s data strategy announcement in February, when it said it wanted to boost data reuse to support a new generation of data-driven services powered by data-hungry artificial intelligence, as well as encouraging the notion of using “tech for good” by enabling “more data and good quality data” to fuel innovation with a common public good (like better disease diagnostics) and improve public services.

The wider context is that personal data is already regulated in the bloc (under the General Data Protection Regulation, or GDPR), which restricts reuse, while commercial considerations can limit how industrial data is shared.

The EU’s executive believes harmonized requirements that set technical and/or legal conditions for data reuse are needed to foster legal certainty and trust — delivered via a framework that promises to maintain rights and protections and thus get more data usefully flowing.

The Commission sees major business benefits flowing from the proposed data governance regime. “Businesses, both small and large, will benefit from new business opportunities as well as from a reduction in costs for acquiring, integrating and processing data, from lower barriers to enter markets, and from a reduction in time-to-market for novel products and services,” it writes in a press release.

It has further data-related proposals incoming in 2021, in addition to a package of digital services legislation it’s due to lay out early next month — as part of a wider reboot of industrial strategy which prioritises digitalization and a green new deal.

All legislative components of the strategy will need to gain the backing of the European Council and parliament so there’s a long road ahead for implementing the plan.

Data Governance Act

EU lawmakers often talk in shorthand about the data strategy being intended to encourage the sharing and reuse of “industrial data” — although the Data Governance Act (DGA) unveiled today has a wider remit.

The Commission envisages the framework enabling the sharing of data that’s subject to data protection legislation — which means personal data; where privacy considerations may (currently) restrain reuse — as well as industrial data subject to intellectual property, or which contains trade secrets or other commercially sensitive information (and is thus not typically shared by its creators primarily for commercial reasons). 

In a press conference on the data governance proposals, internal market commissioner Thierry Breton floated the notion of “data altruism” — saying the Commission wants to provide citizens with an organized way to share their own personal data for a common/public good, such as aiding research into rare diseases or helping cities map mobility for purposes like monitoring urban air quality.

“Through personal data spaces, which are novel personal information management tools and services, Europeans will gain more control over their data and decide on a detailed level who will get access to their data and for what purpose,” the Commission writes in a Q&A on the proposal.

It’s planning a public register where entities will be able to register as a “data altruism organisation” — provided they have a not-for-profit character; meet transparency requirements; and implement certain safeguards to “protect the rights and interests of citizens and companies” — with the aim of providing “maximum trust with minimum administrative burden”, as it puts it.

The DGA envisages different tools, techniques, and requirements governing how public sector bodies share data versus how private companies do so.

For public sector bodies there may be technical requirements (such as encryption or anonymization) attached to the data itself or further processing limitations (such as requiring it to take place in “dedicated infrastructures operated and supervised by the public sector”), as well as legally binding confidentiality agreements that must be signed by the reuser.

“Whenever data is being transferred to a reuser, mechanisms will be in place that ensure compliance with the GDPR and preserve the commercial confidentiality of the data,” the Commission’s PR says.

To encourage businesses to get on board with pooling their own data sets — for the promise of a collective economic upside via access to bigger volumes of pooled data — the plan is for regulated data intermediaries/marketplaces to provide “neutral” data-sharing services, acting as the “trusted” go-between/repository so data can flow between businesses.

“To ensure this neutrality, the data-sharing intermediary cannot exchange the data for its own interest (e.g. by selling it to another company or using it to develop their own product based on this data) and will have to comply with strict requirements to ensure this neutrality,” the Commission writes on this.

Under the plan, intermediaries’ compliance with data handling requirements would be monitored by public authorities at a national level.

But the Commission is also proposing the creation of a new pan-EU body, called the European Data Innovation Board, that would try to knit together best practices across Member States — in what looks like a mirror of the steering/coordinating role undertaken by the European Data Protection Board (which links up the EU’s patchwork of data protection supervisory authorities).

“These data brokers or intermediaries that will provide for data sharing will do that in a way that your rights are protected and that you have choices,” said EVP Margrethe Vestager, who heads up the bloc’s digital strategy, also speaking at today’s press conference.

“So that you can also have personal data spaces where your data is managed. Because, initially, when you ask people they say well actually we do want to share but we don’t really know how to do it. And this is not only the technicalities — it’s also the legal certainty that’s missing. And this proposal will provide that,” she added.

Data localization requirements — or not?

The commissioners faced a number of questions over the hot button issue of international data transfers.

Breton was asked whether the DGA will include any data localization requirements. He responded by saying — essentially — that the rules will bake in a series of conditions which, depending on the data itself and the intended destination, may mean that storing and processing the data in the EU is the only viable option.

“On data localization — what we do is to set a GDPR-type of approach, through adequacy decisions and standard contractual clauses for only sensitive data through a cascading of conditions to allow the international transfer under conditions and in full respect of the protected nature of the data. That’s really the philosophy behind it,” Breton said. “And of course for highly sensitive data [such as] in the public health domain it is necessary to be able to set further conditions, depending on the sensitivity, otherwise… Member States will not share them.”

“For instance it could be possible to limit the reuse of this data into public secure infrastructures so that companies will come to use the data but not keep them. It could be also about restricting the number of access in third countries, restricting the possibility to further transfer the data and if necessary also prohibiting the transfer to a third country,” he went on, adding that such conditions would be “in full respect” of the EU’s WTO obligations.

In a section of its Q&A that deals with data localization requirements, the Commission similarly dances around the question, writing: “There is no obligation to store and process data in the EU. Nobody will be prohibited from dealing with the partner of their choice. At the same time, the EU must ensure that any access to EU citizens’ personal data and certain sensitive data is in compliance with its values and legislative framework.”

At the presser, Breton also noted that companies that want to gain access to EU data that’s been made available for reuse will need to have legal representation in the region. “This is important of course to ensure the enforceability of the rules we are setting,” he said. “It is very important for us — maybe not for other continents but for us — to be fully compliant.”

The commissioners also faced questions about how the planned data reuse rules would be enforced — given ongoing criticism over the lack of uniformly vigorous enforcement of Europe’s data protection framework, GDPR.

“No rule is any good if not enforced,” agreed Vestager. “What we are suggesting here is that if you have a data-sharing service provider and they have notified themselves it’s then up to the authority with whom they have notified actually to monitor and to supervise the compliance with the different things that they have to live up to in order to preserve the protection of these legitimate interests — could be business confidentiality, could be intellectual property rights.

“This is a thing that we will keep on working on also in the future proposals that are upcoming — the Digital Services Act and the Digital Markets Act — but here you have sort of a precursor that the ones who receive the notification in Member States they will also have to supervise that things are actually in order.”

Also responding on the enforcement point, Breton suggested enforcement would be baked in up front, such as by careful control of who could become a data reuse broker.

“[Firstly] we are putting forward common rules and harmonized rules… We are creating a large internal market for data. The second thing is that we are asking Member States to create specific authorities to monitor. The third thing is that we will ensure coherence and enforcement through the European Data Innovation Board,” he said. “Just to give you an example… enforcement is embedded. To be a data broker you will need to fulfil a certain number of obligations and if you fulfil these obligations you can be a neutral data broker — if you don’t

Alongside the DGA, the Commission also announced an Intellectual Property Action Plan.

Vestager said this aims to build on the EU’s existing IP framework with a number of supportive actions — including financial support for SMEs involved in the Horizon Europe R&D program to file patents.

The Commission is also considering whether to reform the framework for filing standards essential patents. But in the short term Vestager said it would aim to encourage industry to engage in forums aimed at reducing litigation.

“One example could be that the Commission could set up an independent system of third party essentiality checks in view of improving legal certainty and reducing litigation costs,” she added of the potential reform, noting that protecting IP is an important component of the bloc’s industrial strategy.

Source: https://techcrunch.com/2020/11/25/europe-sets-out-the-rules-of-the-road-for-its-data-reuse-plan/

How Do You Differentiate AI From Automation?

A lot of us use the terms artificial intelligence (AI) and automation interchangeably to describe the technological takeover of human-operated processes, and a lot of us would stare blankly at the person who asks what the difference between the two is. I know I did when I was asked the same.

It has become common to use these words interchangeably, even in professional settings, to describe innovative advances in routine processes. In actuality, however, the terms are not as similar as you might think. There are huge differences between the intricacy levels of the two.

While automation means making software or hardware that can get things done automatically with minimal human intervention, artificial intelligence refers to making machines intelligent. Automation is suitable for repetitive, daily tasks and may require minimal or no human intervention.

It is based on specific programming and rules. If an organization wishes to convert this automation into AI, it will need to power it with data. Such data is termed big data, and making use of it involves machine learning, graphs, and neural networks. The output of automation is specific; AI, however, carries the risk of uncertainty, just like a human brain.

AI and automation both play a vital role in the modern workplace thanks to the availability of vast data and rapid technological development. Although a Gartner survey claims that more than 37% (over one-third) of organizations use artificial intelligence in some form, these figures do not account for the complexities of implementation.

While it is true that both of these advancements make our work easier, many employees believe that AI and automation are here to take over their jobs. Job loss is an unavoidable phenomenon and is going to take place, automation or not. However, the evolved jobs replacing the traditional ones are going to be more engaging and productive than the outdated ones.

Stephen Hawking said, “The development of full artificial intelligence could spell the end of the human race.” 

Automation

Look around. You’ll find yourself surrounded by automated systems. The reason you don’t have to wait for long hours at the bank, or the reason you don’t have to rewrite the same mail a thousand times, is automation. Automation’s sole purpose is to let machines take over repetitive, tedious, and monotonous tasks.

The primary benefit of employing automation in your business processes is that it frees up the employees’ time, enabling them to focus on more critical tasks. These tasks are those that require personal skill or human judgment. The secondary benefit is the efficiency of business with reduced cost and a productive workforce.

Organizations are more open to adopting automated machinery despite its high installation costs because the machinery never requires sick leave or a holiday. It always gets the work done on time, without a break.

The point to consider in differentiating it from AI is that such machines are all piloted by manual configuration. That is a conceptual way of saying that you have to configure your automation system to suit your organization’s needs and requirements. It is nothing more than a machine with the smarts to follow orders.

Artificial Intelligence

We want machinery that can replicate the human thought process, but we do not wish to experience real-life cases of Interstellar, The Matrix, or WALL-E. That is a precise summation of AI – assisting human life without taking control of it. It is a technology that mimics what a human can say, do, and think, but is not stirred by natural limitations like age and death.

Unlike automation, AI is not confined to repetitive tasks or following orders. Its purpose is to seek patterns, learn from experience, and select the appropriate response in a given situation without depending on human intervention or guidance.

“According to MarketsandMarkets, by 2025, the AI industry will grow into a $190 billion industry.” 

Differences In AI And Automation

As discussed above, people use the terms AI and automation interchangeably, but the two have different objectives. AI’s main objective is to create intelligent machines that can carry out tasks requiring intelligent thinking. It is the engineering and science of making devices so smart that they can mimic human behaviour and intelligence.

AI creates technology that enables computers and machines to think and behave like humans and learn from them. Automation, by contrast, focuses on simplifying and speeding up routine, repetitive tasks to increase the efficiency and quality of the output with minimal to no human intervention. Beyond terminology and objectives, AI and automation differ on the following points.

1. Meaning – Automation is a pre-set program that runs on its own to perform specific tasks. AI is the engineering of systems with human-like thinking capability.
2. Purpose – Automation helps employees by taking over routine, repetitive, and monotonous processes to save time. AI helps by building machines that can carry out tasks requiring human-like intelligent thinking and decision making.
3. Nature of tasks performed – Automation performs repetitive, routine, and monotonous tasks. AI performs more intelligent, critical tasks that require the thinking and judgment of a human brain.
4. Added features – Automation has no notably advanced added features. AI involves self-learning and development from experience.
5. Human intervention – Automation may require a little human intervention (for example, switching the system on or off). AI requires no human intervention, as it takes the necessary information from data and learns from experience or data feeds.

How Are AI And Automation Connected?

Now that we have seen what differentiates them from each other and have understood each individually, let’s see what similarities they hold.

There is one single thing that drives both AI and automation, and that is data. The automated devices collect and combine the data while the systems with artificial intelligence understand it. Indeed, the success or failure of a company depends on numerous factors like increased productivity, employees’ ability to contribute to the organization’s expansion, and business efficiency.

However, the factor more significant than any of these is data. The automated machinery relentlessly and obsessively feeds on the data. With artificial intelligence employed in the systems and automated machines chewing on the data, the companies can make smarter business decisions than before. 

The two systems are highly compatible, and businesses flourish at an entirely different level when the two are combined. Take the example of a modern cloud-based payroll solution. The software does the work of calculating and allocating payroll with the help of programming. AI comes into the picture by taking the data of individual employees, passing it to the automated calculation system, and then transferring the calculated amounts into individual employees’ accounts.

It coordinates with software such as leave management or attendance management to maintain accuracy in the calculation process. The program calculates as it is programmed to, without checking whether the data is correct. It is the AI that sorts the information and gives relevant data to the program to calculate the payroll, thereby becoming the “brain” of the software.
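As a toy illustration of that division of labour, the sketch below pairs a rule-based pay calculation (the "automation") with a simple screening step that filters out implausible records before they reach it (standing in for the "brain"). The record fields and the median-based check are invented for illustration; a real system would use an actual learned model and real payroll rules.

```python
# Toy sketch (illustrative only): rule-based payroll "automation" plus a
# data-screening step standing in for the AI "brain" described above.
from dataclasses import dataclass
from statistics import median

@dataclass
class TimesheetRecord:          # hypothetical record layout
    employee_id: str
    hours_worked: float
    hourly_rate: float

def calculate_pay(record: TimesheetRecord) -> float:
    """Pure automation: applies a fixed rule and never questions its input."""
    overtime = max(0.0, record.hours_worked - 40)
    regular = record.hours_worked - overtime
    return regular * record.hourly_rate + overtime * record.hourly_rate * 1.5

def screen_records(records: list) -> list:
    """Stand-in for the 'intelligent' step: drop records whose hours look
    implausible so only sensible data reaches the automated calculation."""
    med = median(r.hours_worked for r in records)
    return [r for r in records if r.hours_worked <= 2 * med]

records = [
    TimesheetRecord("e1", 40, 20.0),
    TimesheetRecord("e2", 45, 22.0),
    TimesheetRecord("e3", 400, 20.0),  # data-entry error the screen should catch
]
for r in screen_records(records):
    print(r.employee_id, calculate_pay(r))
```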

A combination of AI and automation can produce software that requires no human intervention and delivers 100% accuracy and lawful compliance throughout the process. That, however, is just the tip of the iceberg. Imagine how powerful organizations can become in the future by coupling machines capable of collecting massive amounts of data with systems that can brilliantly make that information meaningful.

“The famous futurologist, visionary, and CEO of Tesla, Elon Musk, said that robots and AI will be able to do everything better than us, creating the biggest risk that we face as a civilization.” 

Final Thoughts

When we think about how technology has changed everything about our lives, we can see that technologies such as automation and AI are becoming more dominant forms of intelligence and embedding themselves in our environments. It is no exaggeration that AI has transformed from being our assistant into something far more powerful, while automated systems are swiftly outperforming us in many pursuits.

Technology is all about exploring possibilities, and that is precisely what AI and automation do: they explore new opportunities to outsmart the human brain. That being said, artificial intelligence is about making machines smart enough to supersede human brilliance and behaviour, while automation simplifies and speeds up processes with minimal or zero human intervention.

Source: https://www.aiiottalk.com/artificial-intelligence/differentiate-ai-from-automation/

MIT Study: Effects of Automation on the Future of Work Challenges Policymakers  

The 2020 MIT Task Force on the Future of Work suggests how the productivity gains of automation can coexist with opportunity for low-wage workers. (Credit: Getty Images)

By John P. Desmond, AI Trends Editor  

Rising productivity brought on by automation has not led to an increase in income for workers. This is among the conclusions of the 2020 report from the MIT Task Force on the Future of Work, founded in 2018 to study the relationship between emerging technologies and work, to shape public discourse, and to explore strategies for enabling shared prosperity.  

Dr. Elisabeth Reynolds, Executive Director, MIT Task Force on the Work of the Future

“Wages have stagnated,” said Dr. Elisabeth Reynolds, Executive Director, MIT Task Force on the Work of the Future, who shared results of the new task force report at the AI and the Work of the Future Congress 2020 held virtually last week.   

The report made recommendations in three areas, the first around translating productivity gains from advances in automation into better-quality jobs. "The quality of jobs in this country has been falling and not keeping up with those in other countries," she said. "Among rich countries, the US is among the worst places for less-educated and low-paid workers." For example, the average hourly wage for low-paid workers in the US is $10/hour, compared to $14/hour for similar workers in Canada, who have health care benefits from national insurance. 

“Our workers are falling behind,” she said.  

The second area of recommendation was to invest and innovate in education and skills training. “This is a pillar of our strategy going forward,” Reynolds said. The report focuses on workers between high school and a four-year degree. “We focus on multiple examples to help workers find the right path to skilled jobs,” she said.  

Many opportunities are emerging in health care, for example, specifically around health information technicians. She cited the IBM P-TECH program, which provides public high school students from underserved backgrounds with the skills they need for competitive STEM jobs, as a good example of education innovation. P-TECH schools enable students to earn both their high school diploma and a two-year associate degree linked to growing STEM fields.  

The third area of recommendation is to shape and expand innovation.  

“Innovation creates jobs and will help the US meet competitive challenges from abroad,” Reynolds said. R&D funding as a percent of GDP in the US has stayed fairly steady from 1953 to 2015, but support from the federal government has declined over that time. “We want to see greater activity by the US government,” she said. 

In a country that is politically divided and economically polarized, many have a fear of technology. Deploying new technology into the existing labor market has the potential to make such divisions worse, continuing downward pressure on wages, skills and benefits, and widening income inequality. “We reject the false tradeoffs between economic growth and having a strong labor market,” Dr. Reynolds said. “Other countries have done it better and the US can do it as well,” she said, noting many jobs that exist today did not exist 40 years ago. 

The COVID-19 crisis has exacerbated the different realities between low-paid workers deemed “essential” needing to be physically present to earn their livings, and higher-paid workers able to work remotely via computers, the report noted.   

The Task Force is co-chaired by MIT Professors David Autor, Economics, and David Mindell, Engineering, in addition to Dr. Reynolds. Members of the task force include more than 20 faculty members drawn from 12 departments at MIT, as well as over 20 graduate students. The 2020 Report can be found here.   

Low-Wage Workers in US Fare Less Well Than Those in Other Advanced Countries  

James Manyika, Senior Partner, McKinsey & Co.

In a discussion on the state of low-wage jobs, James Manyika, Senior Partner, McKinsey & Co., said low-wage workers have not fared well across the 37 countries of the Organisation for Economic Co-operation and Development (OECD), and in the US “they have fared far worse than in other advanced countries.” Jobs are available, but wages are lower and “work has become a lot more fragile,” with many jobs in the gig-worker economy (Uber and Lyft, for example) rather than full-time jobs with some level of benefits. 

Addressing the cost of living, Manyika said the cost of products such as cars and TVs has declined as a percentage of income, but the costs of housing, education, and health care have increased dramatically and are not affordable for many. The growth in low-wage, gig-worker-type jobs has coincided with “the disappearance of labor market protections and worker voice,” he said, noting, “The power of workers has declined dramatically.” 

Geographically, two-thirds of US job growth has happened in 25 metropolitan areas. “Other parts of the country have fared far worse,” he said. “This is a profound challenge.”  

In a session on Labor Market Dynamics, Susan Houseman, VP and Director of Research, W.E. Upjohn Institute for Employment Research, drew a comparison to Denmark for some contrasts. Denmark has a strong safety net of benefits for the unemployed, while the US has “one of the least generous unemployment systems in the world,” she said. “This will be more important in the future with the growing displacement caused by new technology.”  

Another contrast between the US and Denmark is the relationship of labor to management. “The Danish system has a long history of labor management cooperation, with two-thirds of Danish workers in a union,” she said. “In the US, unionization rates have dropped to 10%.”  

“We have a long history of labor management confrontation and not cooperation,” Houseman said. “Unions have really been weakened in the US.”  

As for recommendations, she suggested that the US strengthen its unemployment systems, help labor organizations to build, raise the federal minimum wage [Ed. Note: The federal minimum wage has been $7.25/hour since it was set in 2009.], and provide universal health insurance, “to take it out of the employment market.” 

She suspects the number of workers designated as independent contractors is “likely understated” in the data.   

Jayaraman of One Fair Wage Recommends Sectoral Bargaining 

Saru Jayaraman, President of One Fair Wage and Director, Food Labor Research Center, University of California, Berkeley

Later in the day, Saru Jayaraman, President of One Fair Wage and Director of the Food Labor Research Center at the University of California, Berkeley, spoke about her work with employees and business owners. One Fair Wage is a non-profit organization that advocates for a fair minimum wage, including, for example, the suggestion that tips be counted as a supplement to the minimum wage for restaurant workers.  

“We fight for higher wages and better working conditions, but it’s more akin to sectoral bargaining in other parts of the world,” she said. Sectoral collective bargaining is an effort to reach an agreement covering all workers in a sector of the economy, as opposed to between workers for individual firms. “It is a social contract,” Jayaraman said.  

In France, 98% of workers were covered by sectoral bargaining as of 2015. “The traditional models for improving wages and working conditions workplace by workplace do not work,” she said. She spoke of the need to maintain a “consumer base” of workers who put money back into the economy.   

With the pandemic causing many restaurants to scale back or close, more restaurant owners have reached out to her organization in an effort to get workers back with the help of more cooperative agreements. “We have been approached by hundreds of restaurants in the last six months who are saying it’s time to change to a minimum wage,” she said. “Many were moved that so many workers were not getting unemployment insurance. They are rethinking every aspect of their businesses. They want a more functional system where everybody gets paid, and we move away from slavery. It’s a sea change among employers.”   

She said nearly 800 restaurants are now members of her organization. 

For the future, “We don’t have time for each workplace to be organized. We need to be innovating with sectoral bargaining to raise wages and working conditions across sectors. That is the future combined with workplace organizing,” she said.  

Read the 2020 report from the MIT Task Force on the Future of Work; learn about IBM P-TECH and about One Fair Wage. 

Source: https://www.aitrends.com/workforce/mit-study-effects-of-automation-on-the-future-of-work-challenges-policymakers/
