AI

France’s Shift Technology, a SaaS Provider of AI-Based Decision Automation for Insurance, Secures $220M via Series D

France-based Shift Technology, a SaaS provider of AI-enhanced decision automation and optimization solutions for the insurance sector, recently revealed that it has finalized a $220 million Series D funding round.

Shift Technology’s Series D was led by Advent International, via Advent Tech, with participation from Avenir and other investors, and reportedly marks Advent’s sixth growth equity investment in 2021. The round brings total investment in the company to $320 million and values the company at over $1 billion.

Previous Series C investors Accel, Bessemer Venture Partners, General Catalyst, and Iris Capital also took part in Shift’s Series D round.

Shift said it will use the new capital to expand its business operations in the US, Europe, and Asia.

In the United States, the firm will be penetrating the property and casualty (P&C) insurance sector and will also expand into the health insurance industry, an area in which the company sees a great opportunity.

The funds raised by Shift Technology will also be used to support research and development (R&D) work on new solutions that address insurers’ decision automation and optimization needs.

Initially known for its fraud detection and claims automation solutions, Shift Technology launched its Insurance Suite in January 2021 to let insurance providers apply AI-powered decision automation and optimization technology to a wider array of critical processes across the policy lifecycle, including underwriting, subrogation, and compliance.

The firm currently serves over 100 clients in 25 countries and has reportedly analyzed almost 2 billion claims so far.

Thomas Weisman, a director on Advent’s technology investment team in London, stated:

“Since its founding in 2014, Shift has made a name for itself in the complex world of insurance. Shift’s advanced suite of SaaS products is helping insurers to reshape manual and often time-consuming claims processes in a safer and more automated way. We are proud to be part of this exciting company’s next wave of growth.”

Jeremy Jawish, CEO and co-founder, Shift Technology, remarked:

“We are thrilled to partner with Advent International, given their considerable sector expertise and global reach, and are taking another giant step forward with this latest investment. We have only just scratched the surface of what is possible when AI-based decision automation and optimization is applied to the critical processes that drive the insurance policy lifecycle.”

Source: https://www.crowdfundinsider.com/2021/05/175107-frances-shift-technology-an-saas-provider-of-ai-based-decision-automation-for-insurance-secures-220m-via-series-d/

Artificial Intelligence

KeepTruckin raises $190 million to invest in AI products, double R&D team to 700

KeepTruckin, a hardware and software developer that helps trucking fleets manage vehicle, cargo and driver safety, has just raised $190 million in a Series E funding round, which puts the company’s valuation at $2 billion, according to CEO Shoaib Makani. 

G2 Venture Partners, which just raised a $500 million fund to help modernize existing industries, participated in the round, alongside existing backers including Greenoaks Capital, Index Ventures, IVP and Scale Venture Partners, as well as funds managed by BlackRock.

KeepTruckin intends to invest its new capital back into its AI-powered products, like its GPS tracking, ELD compliance, and dispatch and workflow tools, but it is specifically interested in improving its smart dashcam, which detects unsafe driving behaviors like cell phone distraction and close following, and alerts drivers in real time, according to Makani.

The company says one of its clients, Usher Transport, has seen a 32% annual reduction in accidents after implementing the Smart Dashcam, DRIVE risk score and Safety Hub, products the company offers to increase safety.

“KeepTruckin’s special sauce is that we can build complex models (that other edge cameras can’t yet run) and make it run on the edge with low-power, low-memory and low-bandwidth constraints,” Makani told TechCrunch. “We have developed in-house IPs to solve this problem at different environmental conditions such as low-light, extreme weather, occluded subject and distortions.”

This kind of accuracy requires billions of ground truth data points that are trained and tested on KeepTruckin’s in-house machine learning platform, a process that is very resource-intensive. The platform includes smart annotation capabilities to automatically label the different data points so the neural network can play with millions of potential situations, achieving similar performance to the edge device that’s in the field with real-world environmental conditions, according to Makani.
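
KeepTruckin’s in-house platform is proprietary, so the following is a hedged illustration only: the low-power, low-memory constraint Makani describes is commonly attacked with model quantization. Below is a minimal PyTorch sketch on a stand-in classifier head; the model shape and classes are hypothetical, not anything KeepTruckin has published.

```python
# Illustrative sketch: shrink a model for edge deployment with post-training
# dynamic quantization. A real dashcam model would be a vision network.
import torch
import torch.nn as nn

model = nn.Sequential(
    nn.Linear(512, 256),
    nn.ReLU(),
    # Hypothetical classes: safe / phone distraction / close following / other
    nn.Linear(256, 4),
)

# Convert Linear weights from float32 to int8; activations are quantized
# on the fly at inference time, cutting weight storage roughly 4x.
quantized = torch.quantization.quantize_dynamic(
    model, {nn.Linear}, dtype=torch.qint8
)

x = torch.randn(1, 512)  # stand-in feature vector
print(model(x))          # float32 inference
print(quantized(x))      # int8-weight inference, suitable for an edge device
```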

A 2020 McKinsey study predicted the freight industry is not likely to repeat the kind of year-over-year growth it saw last year, up 30% from 2019, but noted that some industries would increase at higher rates than others. For example, commodities related to e-commerce and agricultural and food products will be the first to return to growth, whereas electronics and automotive might increase at a slower rate due to declining consumer demand for nonessentials.

Since the start of the pandemic, the company says it has experienced 70% annualized growth, in large part due to expansion into new markets like construction, oil and gas, food and beverage, field services, moving and storage, and agriculture. KeepTruckin expects this demand to increase and intends to use the fresh funds to scale rapidly and recruit more talent to advance its AI systems, doubling its R&D team to 700 people globally with a focus on engineering, machine vision, data science and other AI areas, says Makani.

“We think packaging these products into operator-friendly user interfaces for people who are not deeply technical is critical, so front-end and full-stack engineers with experience building incredibly intuitive mobile and web applications are also high priority,” said Makani. 

Much of KeepTruckin’s tech will eventually power autonomous vehicles to make roads safer, says Makani, something that is becoming increasingly relevant as demand for trucking continues to outpace the supply of drivers.

“Level 4 and eventually level 5 autonomy will come to the trucking industry, but we are still many years away from broad deployment,” he said. “Our AI-powered dashcam is making drivers safer and helping prevent accidents today. While the promise of autonomy is real, we are working hard to help companies realize the value of this technology now.”

Source: https://techcrunch.com/2021/06/17/keeptruckin-raises-190-million-to-invest-in-ai-products-double-rd-team-to-700/


AI

Entrenched Data Culture Can Pose Challenge to New AI Systems 

A legacy company may have an entrenched data culture, with established procedures that may have historically worked well, that make a move to AI systems challenging. (Credit: Getty Images) 

By John P. Desmond, AI Trends Editor 

Companies established for a long time—decades or even a century or more old—with thousands of employees in many business units globally, with information systems built over many years on multiple platforms, have entrenched data cultures that may pose challenges for implementing AI systems.  

Data culture refers to the expectation that data will be used to make decisions and optimize the business, making a company data-driven. A data-driven company can be rolling along peacefully, with complex business processes and operations under control and doing the job. Users may have access to the data they need and be encouraged to present their analysis, even if the insights are unwelcome.   

Then someone asks if the company can do it like Netflix or Amazon, with AI algorithms in the background making recommendations and guiding users along, like a Silicon Valley startup. It might not be able to get there from here.

Tom O’Toole, professor, Kellogg School of Management

“These great companies may have built enormously successful and admirable businesses,” stated Tom O’Toole, professor at the Kellogg School of Management, writing recently in Forbes.  

However, many legacy companies have IT organization structures and systems that predate the use of data analytics and, now, AI. The data culture in place may be resistant to change. In many firms, culture is cited as a primary challenge to the successful implementation of AI.

“Established organizations are too often fragmented, siloed, and parochial in their data use, with entrenched impediments to information sharing,” stated O’Toole, who before working in academia was chief marketing officer at United Airlines. Questions to established authority might not be welcome, especially if the top executive doesn’t like the answers. 

To replicate the Silicon Valley approach, the author had these suggestions:  

Get comfortable with transparency. Data that previously resided only within one department will likely have to be shared more broadly across the leadership team. Business performance data needs to be transparent.

Heighten accountability. Greater accountability follows increased transparency. Data needs to be provided to demonstrate that a particular strategy or product launch is effective.  

Embrace unwelcome answers. A data analysis can challenge conventional assumptions, for example by showing performance was less than had been believed, or that the conventional wisdom was not that smart.   

“Creating a data culture is an imperative for continuously advancing business performance and adopting AI and machine learning,” O’Toole stated. 

Survey Shows Concern that Data Quality Issues Will Cause AI to Fail 

Nearly 90% of respondents to a survey by Alation, a company that helps organizations form an effective data culture, are concerned that data quality issues can lead to AI failure.   

Aaron Kalb, cofounder and chief data and analytics officer, Alation

“AI fails when it’s fed bad data, resulting in inaccurate or unfair results,” stated Aaron Kalb, cofounder and chief data and analytics officer, in an account on the Alation blog. “Bad data, in turn, can stem from issues such as inconsistent data standards, data non-compliance, and a lack of data democratization, crowdsourcing, and cataloging.” Survey respondents cited these as the main reasons for AI failures.

The company’s latest survey asked organizations how they are deploying AI and what challenges they are facing doing so. The results showed a correlation between having a top-tier data culture and being more successful at implementing AI systems.  

Data leaders who have deployed AI cite incomplete data as the top issue that leads to AI failures. “This is because when you go searching for data to create the models—be it for product innovation, operational efficiency, or customer experience—you uncover questions around the accuracy, quality, redundancy, and comprehensiveness of the data,” Kalb stated.  
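
The failure modes Kalb lists, incomplete, redundant, and non-standard data, are the kind of thing a basic audit can surface before modeling begins. Here is a generic sketch in Python with pandas; it is not Alation’s product, and the input file is hypothetical.

```python
# Minimal data-quality audit: completeness, redundancy, and loosely typed
# columns that often hide inconsistent standards.
import pandas as pd

df = pd.read_csv("claims.csv")  # hypothetical dataset

report = {
    # Fraction of missing values per column -> incomplete data
    "missing_pct": df.isna().mean().round(3).to_dict(),
    # Fully duplicated rows -> redundant data
    "duplicate_rows": int(df.duplicated().sum()),
    # Free-text columns that may encode dates, codes, etc. inconsistently
    "object_columns": df.select_dtypes(include="object").columns.tolist(),
}
print(report)
```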

Aretec, a data science-focused firm that works to bring efficiency and automation to federal agencies, helps clients deal with legacy data by leveraging AI services themselves to integrate and optimize huge and diverse datasets.   

In a post on the Aretec blog, the company lists the issues it consistently sees impeding the implementation of AI systems:

Data fragmentation. Over time, the data needed to support operations winds up fragmented across multiple data silos, some of them outside an agency or stored with private companies. Fragmented data eventually results in “islands” of duplicated and inconsistent data, incurring unnecessary infrastructure support costs.

Data inconsistencies. Many government agencies need to aggregate data records coming from a variety of sources, and those records are not always consistent in format or content. Even when rigid standards are applied, the standards are likely to evolve over time. The longer the records go back, the greater the chance for variance; a small normalization sketch follows this list.

Learning curves. Many challenges arising from legacy data management are cultural, not technical. Highly skilled employees have spent years learning how to do their jobs efficiently and effectively. They may see any proposed change as compromising their position, with a negative impact on their productivity and morale.
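
As promised above, here is a minimal sketch of normalizing one common inconsistency: dates recorded in different formats by different source systems. The formats and field values are hypothetical.

```python
# Normalize dates from several legacy formats to ISO-8601.
from datetime import datetime

KNOWN_FORMATS = ["%Y-%m-%d", "%m/%d/%Y", "%d %b %Y"]

def normalize_date(raw: str) -> str:
    """Try each legacy format in turn; return an ISO-8601 date string."""
    for fmt in KNOWN_FORMATS:
        try:
            return datetime.strptime(raw.strip(), fmt).date().isoformat()
        except ValueError:
            continue
    raise ValueError(f"unrecognized date format: {raw!r}")

print(normalize_date("2021-06-17"))   # already standard
print(normalize_date("06/17/2021"))   # legacy US format
print(normalize_date("17 Jun 2021"))  # legacy free-text format
```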

NewVantage Survey Finds AI Investment Strong, Success Fleeting

A newly released survey from NewVantage Partners found that Fortune 1000 companies are investing heavily in data and AI initiatives, with 99% of firms reporting investments. However, the ninth annual update of the survey finds that companies are having difficulty maintaining the momentum, according to a recent account in the Harvard Business Review.

Two significant trends were found from the 85 companies surveyed. First, companies that have steadily invested in Big Data and AI initiatives report that the pace of investment in those projects is accelerating, with 62% of firms reporting investments of greater than $50 million.   

The second major finding was that even committed companies struggle to derive value from their Big Data and AI investments and from the effort to become data-driven. “Often saddled with legacy data environments, business processes, skill sets, and traditional cultures that can be reluctant to change, mainstream companies appear to be confronting greater challenges as demands increase, data volumes grow, and companies seek to mature their data capabilities,” stated the author, Randy Bean, the CEO and founder of NewVantage Partners, who originated the survey.  

Only 24% of responding firms said they thought their organization was data-driven in the past year, a decline from 37.8% the year before. And 92% of firms reported that they continue to struggle with cultural challenges related to organizational alignment, business processes, change management, communication, people and skill sets, and resistance to or a lack of the understanding needed to enable change.

“Becoming data-driven takes time, focus, commitment, and persistence. Too many organizations minimize the effort,” stated Bean. 

One recommendation by the study authors was for companies to focus data initiatives on clearly identified business problems or use cases with high impact.

Read the source articles and information in Forbes, on the Alation blog, on the Aretec blog and in the Harvard Business Review.

Source: https://www.aitrends.com/ai-and-business-strategy/entrenched-data-culture-can-pose-challenge-to-new-ai-systems/


AI

Drive AI into Pharma by Asking the Right Questions, Suggest DECODE Speakers 

Speakers at the DECODE: AI for Pharmaceuticals forum emphasized the need to ask the right questions to help drive greater adoption. (Credit: Getty Images) 

By Allison Proffitt, Editorial Director, AI Trends   

At last week’s DECODE: AI for Pharmaceuticals forum, pharma leaders discussed the cultural challenges of AI in pharma and what steps their institutions are taking to better incorporate AI in the enterprise.   

Editor’s Note: To learn more about the DECODE event, read an interview with Dominie Roberts, Cambridge Innovation Institute Senior Event Director, and Emma Huang, Senior Director of Data Sciences External Innovation at Johnson & Johnson Innovation and a member of the DECODE advisory board.

First, Puneet Batra, director of machine learning at the Broad Institute, argued that pharma research—biology specifically—has a crucial role to play in driving AI and computing research. His work is part of the new Eric and Wendy Schmidt Center at the Broad, which has as its mission to position biology to drive the next era of computing.  

Puneet Batra, Director, Machine Learning, Broad Institute

Batra identified two great revolutions of the 21st century: the explosion in data technologies (machine learning, cloud, etc.) and the blossoming of biological technologies (sequencing, single-cell genomes, medical imaging, etc.). These two revolutions are converging, but the goal is not simply to apply machine learning to biological questions.

Machine learning, thus far, has been driven by image recognition and predictive accuracy, Batra pointed out. Machine learning needs to move from predictive accuracy to causal modeling, addressing “why” questions instead of only “what” questions. Biology and its unique biological questions should be a key driver to advances in computing.  

Biological questions come with some specific constraints that will shape new machine learning and computing strategies. Data aren’t available at unlimited scales, data reduction runs risks of losing biological complexity, and models applied in the clinic demand a heightened level of scrutiny. But Batra thinks these are the very drivers that should be shaping computing in the future. The goal, he said, is “to make the central questions biology needs to address, this causal aspect, this mechanistic aspect, to make those key needs drivers of additional advances in computing.”    
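
A toy simulation (not from Batra’s talk) makes the “what” versus “why” gap concrete: a hidden confounder can make a treatment look strongly predictive of worse outcomes even when it has no causal effect at all.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 100_000
severity = rng.normal(size=n)              # hidden confounder
drug = severity + rng.normal(size=n) > 0   # sicker patients get the drug
outcome = -severity + rng.normal(size=n)   # the drug has NO real effect

# "What" question: prediction happily uses the spurious association.
print(outcome[drug].mean() - outcome[~drug].mean())  # strongly negative

# "Why" question: control for severity and the apparent effect vanishes.
stratum = np.abs(severity) < 0.1
print(outcome[drug & stratum].mean() - outcome[~drug & stratum].mean())  # ~0
```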

What Data, Which Problems    

The question-focused approach was a theme throughout the event. Start with a question in mind, several pharma leaders argued in a panel, instead of starting with the data at hand. People tend to focus first on data or algorithms, said Paul Bleicher, founder of PhaseForward, most recently at Optum Labs, and now principal at Evident Health Strategies.   

This approach misses the more fundamental question: What problem are you seeking to solve, and how, if solved, will that create value or quality for the business and the patients? Only then, Bleicher said, do you begin to ask: “What data would you need? Which of the datasets that we have access to can be used? When will that data potentially create bias? Where will it create issues? Once you have that all together, figure out what algorithms and the way you’ll put it all together.”

This problem-first approach enables you to think clearly about how much—and what kind—of data you actually need and which tools you’ll use to process it. Be careful of spending all of your available time, money, and resources getting datasets so beautifully cleaned that there is no bandwidth left for using and acting on the data.    

Jacob Janey, Scientific Director, Bristol-Myers Squibb

Jacob Janey, scientific director, chemical and synthetic development at Bristol-Myers Squibb, argued for a minimum viable model approach to both the data needed and the algorithm chosen. Get “good enough” data, which will depend heavily on the question you are seeking to answer or the problem you hope to solve, he said. And then choose an analysis option that is sufficient for its purpose. “People tend to jump to deep learning or neural nets when sometimes it could be a simple regression or a simple random forest, which has its own benefits,” he said.   
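
A minimal sketch of that advice using scikit-learn: score a simple regression baseline next to a random forest before reaching for anything deeper. The dataset is a stand-in public one, chosen only for illustration.

```python
from sklearn.datasets import load_breast_cancer
from sklearn.ensemble import RandomForestClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

X, y = load_breast_cancer(return_X_y=True)

for name, model in [
    ("logistic regression", LogisticRegression(max_iter=5000)),
    ("random forest", RandomForestClassifier(n_estimators=100, random_state=0)),
]:
    score = cross_val_score(model, X, y, cv=5).mean()
    print(f"{name}: {score:.3f}")

# If the simple baseline is already "good enough" for the question at hand,
# a deeper model may not repay its cost in data, compute, and opacity.
```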

Reimagining the AI Org Chart  

Reza Olfati-Saber, PhD, Global Head of AI & Deep Analytics, Digital & Data Science R&D at Sanofi, outlined the organizational structure that will undergird a true AI-enabled pharma company. He proposed a pyramid architecture with computing (cloud, infrastructure) as its wide base, advancing through applications (data storage, app development, security), data (data governance and security), analytics (data analytics and visualization), machine learning, and finally AI policy (quality and ethics).

Olfati-Saber argues that pharma’s data and AI enterprise should be led by a top digital expert and a top AI expert working together. It is “practically impossible” to expect a Chief Data Officer to know the entire pyramid well enough to facilitate a digital transformation, he said. The tag-team approach is essential. “Anything else wouldn’t do the job,” he said.   

Sessions from DECODE: AI for Pharmaceuticals are now available on demand.

Source: https://www.aitrends.com/ai-and-pharma/drive-ai-into-pharma-by-asking-the-right-questions-suggest-decode-speakers/


AI

Fastly Outage Holds Lessons for CDNs and Website Resiliency  

The recent Fastly outage drew attention to the role of content delivery networks in keeping the internet infrastructure resilient. (Credit: Getty Images)

By John P. Desmond, AI Trends Editor  

On the morning of Tuesday, June 8, many websites went down after an outage at the cloud service firm Fastly, a content delivery network (CDN) provider.

Sites affected included Amazon, Hulu, The New York Times, CNN, the Guardian, Bloomberg News, The Financial Times and the Verge. Also affected were the Reddit, Pinterest and Twitch platforms.   

In a post on the Fastly blog on the day of the outage, Nick Rockwell, the company’s senior VP of engineering and infrastructure, stated that the company’s own developers had inadvertently introduced a bug in a software update, and that the bug was triggered when a customer modified a CDN configuration, a routine procedure.

Nick Rockwell, Senior VP of Engineering and Infrastructure, Fastly

“We detected the disruption within one minute, then identified and isolated the cause, and disabled the configuration. Within 49 minutes, 95% of our network was operating as normal,” stated Rockwell, who was apologetic. “This outage was broad and severe, and we’re truly sorry for the impact to our customers and everyone who relies on them.” 

The damage radius was wide, causing alarm until the cause of the outage was made known and service started to be restored.  

‘Cascading Failure’ Results from Bug in a Software Update 

Complex cloud-based systems with many dependencies pose risks, especially when things go wrong. “You can end up with these cascading failures,” stated Christopher Meiklejohn, a PhD student at Carnegie Mellon’s Institute for Software Research, in an account from Vox. “They’re difficult to debug. They’re stressful and difficult to resolve. And they can be very difficult to detect early on when you’re thinking about making that change, because the systems are so complex, and they involve so many moving parts.” 

The vast systems of CDNs like Fastly, which is one of many, can involve thousands of servers deployed around the world, Meiklejohn stated, making it more likely an outage will be widespread if an error is introduced in the core software. The fact the bug was missed by Fastly’s quality control process is embarrassing for the company. “We’ll figure out why we didn’t detect the bug during our software quality assurance and testing processes,” Rockwell stated in his post. 

The Vox account likened the Fastly outage to one in 2011 when an Amazon cloud computing system, Elastic Block Store, crashed and took Reddit, Quora and Foursquare offline. After the incident, Amazon stated that one of its engineers inadvertently caused a technical problem that traveled throughout its systems and caused the outage.  

The Fastly outage was referred to as an “object lesson in internet fallibility” in an account in The Financial Times. The writer of the account stated, “The failure is a reminder that ‘bugs’ lie buried in all new software programs. Maybe artificial intelligence will one day be able to anticipate and fix all the situations in which a piece of software can fail.”

CDNs move content closer to users, which improves response times. CDN services include web caching, request routing and server-load balancing to reduce load times and improve website performance, according to an account from G2, which guides users in the selection of software and services.

Companies that use CDNs include online video streaming providers and e-commerce companies whose services are adversely affected by poor performance. CDN services are often used in conjunction with website hosting services to optimize content delivery speeds.  

Customers have many options for which CDN to employ. G2 listed over 100 CDNs in its account. Fastly was in the top 10, which also included Cloudflare, CloudFront, KeyCDN, Microsoft Azure CDN and Google Cloud CDN. 

Companies with Multiple CDNs Were Able to Shift Workloads 

Some Fastly customers were able to minimize the impact of the outage by shifting workloads to alternate providers, according to an account from ThousandEyes, a network intelligence company. CDNs provide distributed local delivery, without which streaming media services, for example, would not be able to provide high-quality digital experiences.

Most CDNs today offer advanced security functionality and are able to block common malicious traffic as well as large-scale denial-of-service attacks. Fundamentally, CDNs perform two functions: they deliver content from their edge nodes to end users, and they fetch dynamic content from the site origin to deliver at the edge, according to the ThousandEyes account.
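
A minimal sketch of those two functions, with a cache-aside edge node and a stand-in origin fetch. This is illustrative only, not Fastly’s or any vendor’s actual architecture.

```python
import time

CACHE_TTL_SECONDS = 60
edge_cache: dict[str, tuple[float, bytes]] = {}  # path -> (fetched_at, body)

def fetch_from_origin(path: str) -> bytes:
    # Placeholder for a real HTTP request to the site origin.
    return f"content for {path}".encode()

def serve(path: str) -> bytes:
    """Serve from the edge cache; fall back to the origin on a miss."""
    entry = edge_cache.get(path)
    if entry and time.time() - entry[0] < CACHE_TTL_SECONDS:
        return entry[1]                     # edge hit: fast and local
    body = fetch_from_origin(path)          # edge miss: fetch from origin
    edge_cache[path] = (time.time(), body)
    return body

print(serve("/index.html"))  # miss -> origin fetch, then cached at the edge
print(serve("/index.html"))  # hit  -> served from the edge cache
```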

Many popular high-volume sites use more than one CDN provider to deliver content to users, primarily for redundancy but also to optimize performance. This is done, for example, by load balancing user requests across multiple CDNs.
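
A sketch of that pattern: weighted selection across delivery paths, with traffic steered away from any provider that fails health checks. The provider names and weights here are hypothetical.

```python
import random

weights = {"cdn_a": 0.5, "cdn_b": 0.3, "origin": 0.2}
healthy = {"cdn_a": True, "cdn_b": True, "origin": True}

def pick_provider() -> str:
    """Weighted choice among the currently healthy delivery paths."""
    candidates = {p: w for p, w in weights.items() if healthy[p]}
    if not candidates:
        raise RuntimeError("no healthy delivery path")
    providers, w = zip(*candidates.items())
    return random.choices(providers, weights=w, k=1)[0]

print(pick_provider())     # normal operation: weighted across all providers
healthy["cdn_a"] = False   # health checks detect an outage at one CDN
print(pick_provider())     # traffic now flows only over healthy paths
```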

Angelique Medina, Director of Product Marketing, ThousandEyes

“How a site or application owner chooses to architect its content delivery can determine the severity of impact of an outage like the one Fastly experienced,” stated the account’s author, Angelique Medina, Director of Product Marketing for ThousandEyes. “Some of Fastly’s customers had resilient delivery architectures or they were able to take action to mitigate the impact of the incident—leading to very different outcomes for their users,” she noted.  

The company examined the experience of four companies in detail. The New York Times and Reddit each used Fastly’s service as the sole CDN for their primary domains, but the two firms had different experiences. Beginning at 9:50 UTC (5:50 am ET), Reddit was down around the globe; service was restored about an hour later.

The New York Times, in contrast, temporarily redirected users to the site’s origin servers hosted on Google Cloud Platform, reducing the downtime of its service for users. The beginning of the outage was similar to Reddit’s experience, but 40 minutes into the outage, service availability “significantly increased,” well before Fastly implemented a fix. By 10:50 UTC, no Fastly servers were in the delivery path for the NYT.

After Fastly implemented its fix, just before 10:50 UTC, NYTimes users were redirected back to the Fastly servers. By 11:30 UTC, the site had returned to its pre-outage state.

Amazon uses three CDNs to deliver its site, load balancing traffic across each to deliver the best possible experience to its users. Amazon has its own CDN service, CloudFront, which is part of its AWS offerings. Amazon also uses Akamai and Fastly to host its site.

One CDN vantage point targeting Amazon’s site showed it being directed to a Fastly server just after 8:00 UTC. A few minutes later, it was directed to an Akamai server, and less than 10 minutes later it was switched over to an Amazon server. “This active allocation of users across multiple CDN services is part of normal operations for Amazon,” Medina stated.

Amazon eventually steered users to site components hosted by its own CDN and others, such as Akamai and EdgeCast. By approximately 10:40 UTC, site loading issues had been resolved for most Amazon users. 

Read the source articles and information on the Fastly blog, from Vox, in The Financial Times, from G2 and from ThousandEyes.

Source: https://www.aitrends.com/infrastructure-for-ai/fastly-outage-holds-lessons-for-cdns-and-website-resiliency/
