Artificial Intelligence

Longevity startup Gero AI has a mobile API for quantifying health changes

Sensor data from smartphones and wearables can meaningfully predict an individual’s ‘biological age’ and resilience to stress, according to Gero AI.

The ‘longevity’ startup — which condenses its mission to the pithy goal of “hacking complex diseases and aging with Gero AI” — has developed an AI model to predict morbidity risk using ‘digital biomarkers’ that are based on identifying patterns in step-counter sensor data which tracks mobile users’ physical activity.

The contention is that a simple measure of ‘steps’ isn’t nuanced enough on its own to predict individual health. Gero’s AI has been trained on large amounts of biological data to spot patterns that can be linked to morbidity risk. It also measures how quickly a person recovers from a biological stress — another biomarker that’s been linked to lifespan; i.e. the faster the body recovers from stress, the better the individual’s overall health prognosis.

A research paper Gero has had published in the peer-reviewed biomedical journal Aging explains how it trained deep neural networks to predict morbidity risk from mobile device sensor data — and was able to demonstrate that its biological age acceleration model was comparable to models based on blood test results.

Another paper, due to be published in the journal Nature Communications later this month, will go into detail on its device-derived measurement of biological resilience.

The Singapore-based startup, which has research roots in Russia — founded back in 2015 by a Russian scientist with a background in theoretical physics — has raised a total of $5 million in seed funding to date (in two tranches).

Backers come from both the biotech and the AI fields, per co-founder Peter Fedichev. Its investors include Belarus-based AI-focused early stage fund, Bulba Ventures (Yury Melnichek). On the pharma side, it has backing from some (unnamed) private individuals with links to Russian drug development firm, Valenta. (The pharma company itself is not an investor).

Fedichev is a theoretical physicist by training who, after his PhD and some ten years in academia, moved into biotech to work on molecular modelling and machine learning for drug discovery — where he got interested in the problem of ageing and decided to start the company.

As well as conducting its own biological research into longevity (studying mice and nematodes), it’s focused on developing an AI model for predicting the biological age and resilience to stress of humans — via sensor data captured by mobile devices.

“Health of course is much more than one number,” emphasizes Fedichev. “We should not have illusions about that. But if you are going to condense human health to one number then, for a lot of people, the biological age is the best number. It tells you — essentially — how toxic is your lifestyle… The more biological age you have relative to your chronological age years — that’s called biological acceleration — the more are your chances to get chronic disease, to get seasonal infectious diseases or also develop complications from those seasonal diseases.”
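The arithmetic behind that definition is simple enough to spell out. A minimal illustration with made-up numbers (the ages below are placeholders, not real model output):

```python
chronological_age = 45.0   # calendar years
biological_age = 51.5      # hypothetical estimate from activity data
acceleration = biological_age - chronological_age
# +6.5 years of "biological acceleration"; per Fedichev, higher
# values imply higher chronic and infectious disease risk.
print(acceleration)
```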

Gero has recently launched a (paid, for now) API, called GeroSense, that’s aimed at health and fitness apps so they can tap up its AI modelling to offer their users an individual assessment of biological age and resilience (aka recovery rate from stress back to that individual’s baseline).
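To make the integration concrete, here is a minimal sketch of what a client call to such an API could look like. The endpoint URL, field names and response shape below are illustrative assumptions, not Gero’s documented interface:

```python
import requests

# Hypothetical endpoint and payload, for illustration only; the real
# GeroSense interface may differ, so consult Gero's API documentation.
API_URL = "https://api.example.com/gerosense/v1/biomarkers"  # placeholder URL

def get_biomarkers(api_key: str, user_id: str, daily_steps: list) -> dict:
    """Send a window of step-count data and return the model's outputs."""
    resp = requests.post(
        API_URL,
        headers={"Authorization": f"Bearer {api_key}"},
        json={"user_id": user_id, "steps_per_day": daily_steps},
        timeout=30,
    )
    resp.raise_for_status()
    # Assumed response fields: biological age (years) and resilience,
    # i.e. recovery rate back to the user's own baseline after stress.
    return resp.json()  # e.g. {"biological_age": 42.3, "resilience": 0.78}
```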

Early partners are other longevity-focused companies, AgelessRx and Humanity Inc. But the idea is to get the model widely embedded into fitness apps where it will be able to send a steady stream of longitudinal activity data back to Gero, to further feed its AI’s predictive capabilities and support the wider research mission — where it hopes to progress anti-ageing drug discovery, working in partnerships with pharmaceutical companies.

The carrot for the fitness providers to embed the API is to offer their users a fun and potentially valuable feature: A personalized health measurement so they can track positive (or negative) biological changes — helping them quantify the value of whatever fitness service they’re using.

“Every health and wellness provider — maybe even a gym — can put into their app for example… and this thing can rank all their classes in the gym, all their systems in the gym, for their value for different kinds of users,” explains Fedichev.

“We developed these capabilities because we need to understand how ageing works in humans, not in mice. Once we developed it we’re using it in our sophisticated genetic research in order to find genes — we are testing them in the laboratory — but, this technology, the measurement of ageing from continuous signals like wearable devices, is a good trick on its own. So that’s why we announced this GeroSense project,” he goes on.

“Ageing is this gradual decline of your functional abilities which is bad but you can go to the gym and potentially improve them. But the problem is you’re losing this resilience. Which means that when you’re [biologically] stressed you cannot get back to the norm as quickly as possible. So we report this resilience. So when people start losing this resilience it means that they’re not robust anymore and the same level of stress as in their 20s would get them [knocked off] the rails.

“We believe this loss of resilience is one of the key ageing phenotypes because it tells you that you’re vulnerable for future diseases even before those diseases set in.”
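The recovery-rate idea can be illustrated with a toy model: treat deviations of daily activity from an individual’s baseline as relaxing back over time, and estimate how long that relaxation takes. This is a simplified sketch of the general concept, not Gero’s published method; the data and the 8,000-step baseline are synthetic:

```python
import numpy as np

def recovery_time(activity: np.ndarray, baseline: float) -> float:
    """Estimate a relaxation time (in days) from a daily activity series.

    Models deviations from baseline as x[t+1] ~ a * x[t]; the lag-1
    autoregression coefficient maps to a recovery time of -1/ln(a).
    A coefficient drifting toward 1 is the loss of resilience described
    above: the body takes longer and longer to return to the norm.
    """
    x = activity - baseline
    a = np.dot(x[:-1], x[1:]) / np.dot(x[:-1], x[:-1])
    a = min(max(a, 1e-6), 0.999999)  # keep the estimate in a stable range
    return -1.0 / np.log(a)

# Toy usage: six months of synthetic steps around an 8,000-step baseline.
rng = np.random.default_rng(0)
x, series = 3000.0, []
for _ in range(180):
    x = 0.8 * x + rng.normal(0, 300)  # true recovery factor of 0.8
    series.append(8000 + x)
print(round(recovery_time(np.array(series), 8000.0), 1), "days to recover")
```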

“In-house everything is ageing. We are totally committed to ageing: Measurement and intervention,” adds Fedichev. “We want to build something like an operating system for longevity and wellness.”

Gero is also generating some revenue from two pilots with “top range” insurance companies — which Fedichev says it’s essentially running as a proof of business model at this stage. He also mentions an early pilot with Pepsi Co.

He sketches a link between how it hopes to work with insurance companies in the area of health outcomes and how Elon Musk is offering insurance products to owners of his sensor-laden Teslas, based on what the company knows about how they drive — because both are putting sensor data in the driving seat, if you’ll pardon the pun. (“Essentially we are trying to do to humans what Elon Musk is trying to do to cars,” is how he puts it.)

But the nearer term plan is to raise more funding — and potentially switch to offering the API for free to really scale up the data capture potential.

Zooming out for a little context, it’s been almost a decade since Google-backed Calico launched with the moonshot mission of ‘fixing death’. Since then a small but growing field of ‘longevity’ startups has sprung up, conducting research into extending (in the first instance) human lifespan. (Ending death is, clearly, the moonshot atop the moonshot.) 

Death is still with us, of course, but the business of identifying possible drugs and therapeutics to stave off the grim reaper’s knock continues picking up pace — attracting a growing volume of investor dollars.

The trend is being fuelled by health and biological data becoming ever more plentiful and accessible, thanks to open research data initiatives and the proliferation of digital devices and services for tracking health, set alongside promising developments in the fast-evolving field of machine learning in areas like predictive healthcare and drug discovery.

Longevity has also seen a bit of an upsurge in interest in recent times as the coronavirus pandemic has concentrated minds on health and wellness, generally — and, well, mortality specifically.

Nonetheless, it remains a complex, multi-disciplinary business. Some of these biotech moonshots are focused on bioengineering and gene-editing — pushing for disease diagnosis and/or drug discovery.

Plenty are also — like Gero —  trying to use AI and big data analysis to better understand and counteract biological ageing, bringing together experts in physics, maths and biological science to hunt for biomarkers to further research aimed at combating age-related disease and deterioration.

Another recent example is AI startup Deep Longevity, which came out of stealth last summer — as a spinout from AI drug discovery startup Insilico Medicine — touting an AI ‘longevity as a service’ system which it claims can predict an individual’s biological age “significantly more accurately than conventional methods” (and which it also hopes will help scientists to unpick which “biological culprits drive aging-related diseases”, as it put it).

Gero AI is taking a different tack toward the same overarching goal — homing in on data generated by activity sensors embedded in the everyday mobile devices people carry with them (or wear) as a proxy signal for studying their biology.

The advantage is that it doesn’t require a person to undergo regular (invasive) blood tests to get an ongoing measure of their own health. Instead, personal devices can passively generate proxy signals for biological study — at vast scale and low cost. So the promise of Gero’s ‘digital biomarkers’ is that they could democratize access to individual health prediction.

And while billionaires like Peter Thiel can afford to shell out for bespoke medical monitoring and interventions to try to stay one step ahead of death, such high end services simply won’t scale to the rest of us.

If its digital biomarkers live up to Gero’s claims, its approach could, at the least, help steer millions towards healthier lifestyles, while also generating rich data for longevity R&D — and to support the development of drugs that could extend human lifespan (albeit what such life-extending pills might cost is a whole other matter).

The insurance industry is naturally interested — with the potential for such tools to be used to nudge individuals towards healthier lifestyles and thereby reduce payout costs.

For individuals who are motivated to improve their health themselves, Fedichev says the issue now is it’s extremely hard for people to know exactly which lifestyle changes or interventions are best suited to their particular biology.

For example fasting has been shown in some studies to help combat biological ageing. But he notes that the approach may not be effective for everyone. The same may be true of other activities that are accepted to be generally beneficial for health (like exercise or eating or avoiding certain foods).

Again those rules of thumb may have a lot of nuance, depending on an individual’s particular biology. And scientific research is, inevitably, limited by access to funding. (Research can thus tend to focus on certain groups to the exclusion of others — e.g. men rather than women; or the young rather than middle aged.)

This is why Fedichev believes there’s a lot of value in creating a measure that can address health-related knowledge gaps at essentially no individual cost.

Gero has used longitudinal data from the UK Biobank, one of its research partners, to verify its model’s measurements of biological age and resilience. But of course it hopes to go further — as it ingests more data.

“Technically it’s not properly different what we are doing — it just happens that we can do it now because there are such efforts like UK biobank. Government money and also some industry sponsors money, maybe for the first time in the history of humanity, we have this situation where we have electronic medical records, genetics, wearable devices from hundreds of thousands of people, so it just became possible. It’s the convergence of several developments — technological but also what I would call ‘social technologies’ [like the UK biobank],” he tells TechCrunch.

“Imagine that for every diet, for every training routine, meditation… in order to make sure that we can actually optimize lifestyles — understand which things work, which do not [for each person] or maybe some experimental drugs which are already proved [to] extend lifespan in animals are working, maybe we can do something different.”

“When we will have 1M tracks [half a year’s worth of data on 1M individuals] we will combine that with genetics and solve ageing,” he adds, with entrepreneurial flourish. “The ambitious version of this plan is we’ll get this million tracks by the end of the year.”

Fitness and health apps are an obvious target partner for data-loving longevity researchers — but you can imagine it’ll be a mutual attraction. One side can bring the users, the other a halo of credibility comprised of deep tech and hard science.

“We expect that these [apps] will get lots of people and we will be able to analyze those people for them as a fun feature first, for their users. But in the background we will build the best model of human ageing,” Fedichev continues, predicting that scoring the effect of different fitness and wellness treatments will be “the next frontier” for wellness and health. (Or, more pithily: “Wellness and health has to become digital and quantitative.”)

“What we are doing is we are bringing physicists into the analysis of human data. Since recently we have lots of biobanks, we have lots of signals — including from available devices which produce something like a few years’ long windows on the human ageing process. So it’s a dynamical system — like weather prediction or financial market predictions,” he also tells us.

“We cannot own the treatments because we cannot patent them but maybe we can own the personalization — the AI that personalized those treatments for you.”

From a startup perspective, one thing looks crystal clear: Personalization is here for the long haul.

Source: https://techcrunch.com/2021/05/07/longevity-startup-gero-ai-has-a-mobile-api-for-quantifying-health-changes/

Artificial Intelligence

KeepTruckin raises $190 million to invest in AI products, double R&D team to 700


KeepTruckin, a hardware and software developer that helps trucking fleets manage vehicle, cargo and driver safety, has just raised $190 million in a Series E funding round, which puts the company’s valuation at $2 billion, according to CEO Shoaib Makani. 

G2 Venture Partners, which just raised a $500 million fund to help modernize existing industries, participated in the round, alongside existing backers like Greenoaks Capital, Index Ventures, IVP and Scale Venture Partners, as well as funds managed by BlackRock. 

KeepTruckin intends to invest the new capital back into its AI-powered products, such as its GPS tracking, ELD compliance, and dispatch and workflow tools, but it’s specifically interested in improving its smart dashcam, which instantly detects unsafe driving behaviors like cell phone distraction and close following, and alerts drivers in real time, according to Makani. 

Usher Transport, one of KeepTruckin’s clients, says it has seen a 32% annual reduction in accidents after implementing the Smart Dashcam, DRIVE risk score and Safety Hub, products the company offers to increase safety.

“KeepTruckin’s special sauce is that we can build complex models (that other edge cameras can’t yet run) and make it run on the edge with low-power, low-memory and low-bandwidth constraints,” Makani told TechCrunch. “We have developed in-house IPs to solve this problem at different environmental conditions such as low-light, extreme weather, occluded subject and distortions.”

This kind of accuracy requires billions of ground truth data points that are trained and tested on KeepTruckin’s in-house machine learning platform, a process that is very resource-intensive. The platform includes smart annotation capabilities to automatically label the different data points so the neural network can play with millions of potential situations, achieving similar performance to the edge device that’s in the field with real-world environmental conditions, according to Makani.
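As a generic illustration of the edge constraints Makani describes (and not KeepTruckin’s actual pipeline), post-training quantization is one standard way to shrink a trained network so it fits low-power, low-memory devices. The toy model and layer sizes below are placeholders:

```python
import torch
import torch.nn as nn

# Placeholder network standing in for a trained driving-behavior classifier.
model = nn.Sequential(
    nn.Linear(128, 256), nn.ReLU(),
    nn.Linear(256, 64), nn.ReLU(),
    nn.Linear(64, 2),  # e.g. attentive vs. distracted
)
model.eval()

# Post-training dynamic quantization stores weights as int8, cutting
# memory roughly 4x and speeding up inference on small CPUs.
quantized = torch.quantization.quantize_dynamic(
    model, {nn.Linear}, dtype=torch.qint8
)

features = torch.randn(1, 128)  # stand-in for features extracted from a frame
print(quantized(features))      # same interface, smaller footprint
```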

A 2020 McKinsey study predicted the freight industry is not likely to see the kind of YOY growth it saw last year, which was 30% up from 2019, but noted that some industries would increase at higher rates than others. For example, commodities related to e-commerce and agricultural and food products will be the first to return to growth, whereas electronics and automotive might increase at a slower rate due to declining consumer demand for nonessentials. 

Since the pandemic, the company said it experienced 70% annualized growth, in large part due to expansion into new markets like construction, oil and gas, food and beverage, field services, moving and storage and agriculture. KeepTruckin expects this demand to increase and intends to use the fresh funds to scale rapidly and recruit more talent that will help progress its AI systems, doubling its R&D team to 700 people globally with a focus on engineering, machine vision, data science and other AI areas, says Makani. 

“We think packaging these products into operator-friendly user interfaces for people who are not deeply technical is critical, so front-end and full-stack engineers with experience building incredibly intuitive mobile and web applications are also high priority,” said Makani. 

Much of KeepTruckin’s tech will eventually power autonomous vehicles to make roads safer, says Makani, something that’s also becoming increasingly relevant as the demand for trucking continues to outpace supply of drivers.

“Level 4 and eventually level 5 autonomy will come to the trucking industry, but we are still many years away from broad deployment,” he said. “Our AI-powered dashcam is making drivers safer and helping prevent accidents today. While the promise of autonomy is real, we are working hard to help companies realize the value of this technology now.”

Source: https://techcrunch.com/2021/06/17/keeptruckin-raises-190-million-to-invest-in-ai-products-double-rd-team-to-700/


AI

Entrenched Data Culture Can Pose Challenge to New AI Systems 


A legacy company may have an entrenched data culture, with established procedures that may have historically worked well, that make a move to AI systems challenging. (Credit: Getty Images) 

By John P. Desmond, AI Trends Editor 

Companies established for a long time—decades or even a century or more old—with thousands of employees in many business units globally, with information systems built over many years on multiple platforms, have entrenched data cultures that may pose challenges for implementing AI systems.  

Data culture refers to the expectation that data will be used to make decisions and optimize the business, making a company data-driven. A data-driven company can be rolling along peacefully, with complex business processes and operations under control and doing the job. Users may have access to the data they need and be encouraged to present their analysis, even if the insights are unwelcome.   

Then someone asks if the company can do it like Netflix or Amazon, with AI algorithms in the background making recommendations and guiding users along, like a Silicon Valley startup. Might not be able to get there from here.  

Tom O’Toole, professor, Kellogg School of Management

“These great companies may have built enormously successful and admirable businesses,” stated Tom O’Toole, professor at the Kellogg School of Management, writing recently in Forbes.  

However, many legacy companies have IT organization structures and systems that predate the use of data analytics, and now AI. The data culture in place may be resistant to change. In many firms, culture is cited as a primary challenge to the successful implementation of AI.   

“Established organizations are too often fragmented, siloed, and parochial in their data use, with entrenched impediments to information sharing,” stated O’Toole, who before working in academia was chief marketing officer at United Airlines. Questions to established authority might not be welcome, especially if the top executive doesn’t like the answers. 

To replicate the Silicon Valley approach, the author had these suggestions:  

Get comfortable with transparency. Data that previously resided only within one department will likely have to be shared more broadly across the leadership team. Business performance data needs to be transparent.  

Heighten accountability. Greater accountability follows increased transparency. Data needs to be provided to demonstrate that a particular strategy or product launch is effective.  

Embrace unwelcome answers. A data analysis can challenge conventional assumptions, for example by showing performance was less than had been believed, or that the conventional wisdom was not that smart.   

“Creating a data culture is an imperative for continuously advancing business performance and adopting AI and machine learning,” O’Toole stated. 

Survey Shows Concern that Data Quality Issues Will Cause AI to Fail 

Nearly 90% of respondents to a survey by Alation, a company that helps organizations form an effective data culture, are concerned that data quality issues can lead to AI failure.   

Aaron Kalb, cofounder and chief data and analytics officer, Alation

“AI fails when it’s fed bad data, resulting in inaccurate or unfair results,” stated Aaron Kalb, cofounder and chief data and analytics officer, in an account on the Alation blog. “Bad data, in turn, can stem from issues such as inconsistent data standards, data non-compliance, and a lack of data democratization, crowdsourcing, and cataloging.” Survey respondents cited these as the main reasons for AI failures. 

The company’s latest survey asked organizations how they are deploying AI and what challenges they are facing doing so. The results showed a correlation between having a top-tier data culture and being more successful at implementing AI systems.  

Data leaders who have deployed AI cite incomplete data as the top issue that leads to AI failures. “This is because when you go searching for data to create the models—be it for product innovation, operational efficiency, or customer experience—you uncover questions around the accuracy, quality, redundancy, and comprehensiveness of the data,” Kalb stated.  

Aretec, a data science-focused firm that works to bring efficiency and automation to federal agencies, helps clients deal with legacy data by leveraging AI services themselves to integrate and optimize huge and diverse datasets.   

In a post on the Aretec blog, the company lists the issues it consistently sees impeding the implementation of AI systems:   

Data Fragmentation. Over time, the data needed to support operations winds up fragmented across multiple data silos. Some can be outside an agency or stored with private companies. Fragmented data eventually results in “islands” of duplicated and inconsistent data, incurring infrastructure support costs that are not necessary. 

Data inconsistencies. Many government agencies need to aggregate data records coming from a variety of sources, records not always in a consistent format or content. Even when rigid standards are applied, the standards are likely to evolve over time. The longer the records go back, the greater the chance for variance.  

Learning curves. Many challenges arising from legacy data management are cultural, not technical. Highly-skilled employees have spent years learning how to do their job efficiently and effectively. They may see any proposed change as compromising their position, thus having a negative impact on their productivity and morale.  

NewVantage Survey Finds AI Investment Strong, Success Fleeting 

A newly-released survey from NewVantage Partners found that Fortune 1000 companies are investing heavily in data and AI initiatives, with 99% of firms reporting investments. However, the ninth annual update of the survey finds that companies are having difficulty maintaining the momentum, according to a recent account in the Harvard Business Review.  

Two significant trends were found from the 85 companies surveyed. First, companies that have steadily invested in Big Data and AI initiatives report that the pace of investment in those projects is accelerating, with 62% of firms reporting investments of greater than $50 million.   

The second major finding was that even committed companies struggle to derive value from their Big Data and AI investments and from the effort to become data-driven. “Often saddled with legacy data environments, business processes, skill sets, and traditional cultures that can be reluctant to change, mainstream companies appear to be confronting greater challenges as demands increase, data volumes grow, and companies seek to mature their data capabilities,” stated the author, Randy Bean, the CEO and founder of NewVantage Partners, who originated the survey.  

Only 24% of responding firms said they thought their organization was data-driven in the past year, a decline from 37.8% the year before. And 92% of firms reported that they continue to struggle with cultural challenges related to organizational alignment, business processes, change management, communication, people and skill sets, resistance, and a lack of the understanding needed to enable change.   

“Becoming data-driven takes time, focus, commitment, and persistence. Too many organizations minimize the effort,” stated Bean. 

One recommendation by the study authors was for companies to focus data initiatives on clearly-identified business problems or use cases with high impact.  

Read the source articles and information in Forbes, on the Alation blog, on the Aretec blog and in the Harvard Business Review. 

Source: https://www.aitrends.com/ai-and-business-strategy/entrenched-data-culture-can-pose-challenge-to-new-ai-systems/


AI

Drive AI into Pharma by Asking the Right Questions, Suggest DECODE Speakers 


Speakers at the DECODE: AI for Pharmaceuticals forum emphasized the need to ask the right questions to help drive greater adoption. (Credit: Getty Images) 

By Allison Proffitt, Editorial Director, AI Trends   

At last week’s DECODE: AI for Pharmaceuticals forum, pharma leaders discussed the cultural challenges of AI in pharma and what steps their institutions are taking to better incorporate AI in the enterprise.   

Editor’s Note: To learn more about the DECODE event, read an interview with Dominie Roberts, Cambridge Innovation Institute Senior Event Director, and Emma Huang, Senior Director of Data Sciences External Innovation at Johnson & Johnson Innovation and a member of the DECODE advisory board.   

First, Puneet Batra, director of machine learning at the Broad Institute, argued that pharma research—biology specifically—has a crucial role to play in driving AI and computing research. His work is part of the new Eric and Wendy Schmidt Center at the Broad, which has as its mission to position biology to drive the next era of computing.  

Puneet Batra, Director, Machine Learning, Broad Institute

Batra identified two great revolutions of the 21st Century: the explosion in data technologies (machine learning, cloud, etc.) as well as the blossoming of biological technologies (sequencing, single-cell genomes, medical imaging, etc.). These two revolutions are converging, but the goal is not simply to apply machine learning to biological questions.     

Machine learning, thus far, has been driven by image recognition and predictive accuracy, Batra pointed out. Machine learning needs to move from predictive accuracy to causal modeling, addressing “why” questions instead of only “what” questions. Biology and its unique biological questions should be a key driver to advances in computing.  

Biological questions come with some specific constraints that will shape new machine learning and computing strategies. Data aren’t available at unlimited scales, data reduction runs risks of losing biological complexity, and models applied in the clinic demand a heightened level of scrutiny. But Batra thinks these are the very drivers that should be shaping computing in the future. The goal, he said, is “to make the central questions biology needs to address, this causal aspect, this mechanistic aspect, to make those key needs drivers of additional advances in computing.”    

What Data, Which Problems    

The question-focused approach was a theme throughout the event. Start with a question in mind, several pharma leaders argued in a panel, instead of starting with the data at hand. People tend to focus first on data or algorithms, said Paul Bleicher, founder of PhaseForward, most recently at Optum Labs, and now principal at Evident Health Strategies.   

This approach misses the more fundamental question: What problem are you seeking to solve, and how—if solved—will that create value or quality for the business and the patients? Only then, Bleicher said, do you begin to ask: “What data would you need? Which of the datasets that we have access to can be used? When will that data potentially create bias? Where will it create issues? Once you have that all together, figure out what algorithms and the way you’ll put it all together.”    

This problem-first approach enables you to think clearly about how much—and what kind—of data you actually need and which tools you’ll use to process it. Be careful of spending all of your available time, money, and resources getting datasets so beautifully cleaned that there is no bandwidth left for using and acting on the data.    

Jacob Janey, Scientific Director, Bristol-Myers Squibb

Jacob Janey, scientific director, chemical and synthetic development at Bristol-Myers Squibb, argued for a minimum viable model approach to both the data needed and the algorithm chosen. Get “good enough” data, which will depend heavily on the question you are seeking to answer or the problem you hope to solve, he said. And then choose an analysis option that is sufficient for its purpose. “People tend to jump to deep learning or neural nets when sometimes it could be a simple regression or a simple random forest, which has its own benefits,” he said.   
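A minimal sketch of Janey’s point, using synthetic data: score a simple baseline against a heavier model before assuming the complex one is needed. The dataset and model choices here are illustrative only:

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import cross_val_score

# Synthetic stand-in for experimental data with a mostly linear signal.
rng = np.random.default_rng(42)
X = rng.normal(size=(200, 5))
y = 2.0 * X[:, 0] - X[:, 1] + rng.normal(scale=0.1, size=200)

for name, model in [
    ("linear regression", LinearRegression()),
    ("random forest", RandomForestRegressor(n_estimators=100, random_state=0)),
]:
    score = cross_val_score(model, X, y, cv=5, scoring="r2").mean()
    print(f"{name}: mean R^2 = {score:.3f}")
# If the simple model already scores well, it may be the minimum viable model.
```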

Reimagining the AI Org Chart  

Reza Olfati-Saber, PhD, Global Head AI & Deep Analytics, Digital & Data Science R&D, at Sanofi outlined the organizational structure that will undergird a true AI-enabled pharma company. He proposed a pyramid architecture with computing (cloud, infrastructure) as its wide base, advancing through applications (data storage, app development, security), data (data governance and security), analytics (data analytics and visualization), machine learning, and finally AI policy (quality and ethics).   

Olfati-Saber argues that pharma’s data and AI enterprise should be led by a top digital expert and a top AI expert working together. It is “practically impossible” to expect a Chief Data Officer to know the entire pyramid well enough to facilitate a digital transformation, he said. The tag-team approach is essential. “Anything else wouldn’t do the job,” he said.   

Sessions from DECODE: AI for Pharmaceuticals are now available on demand. 

Source: https://www.aitrends.com/ai-and-pharma/drive-ai-into-pharma-by-asking-the-right-questions-suggest-decode-speakers/


AI

Fastly Outage Holds Lessons for CDNs and Website Resiliency  


The recent Fastly outage drew attention to the role of content delivery networks in keeping the internet infrastructure resilient. (Credit: Getty Images) 

By John P. Desmond, AI Trends Editor  

On the morning of Tuesday, June 8, many websites went down after an outage at the cloud service firm Fastly, a content delivery network (CDN) provider.  

Sites affected included Amazon, Hulu, The New York Times, CNN, the Guardian, Bloomberg News, The Financial Times and the Verge. Also affected were the Reddit, Pinterest and Twitch platforms.   

In a post on the Fastly blog on the day of the outage, Nick Rockwell, the company’s senior VP of engineering and infrastructure, stated that a bug was introduced by the company’s own developers by mistake in a software update, and that bug was triggered when a customer modified a CDN configuration, which is a routine procedure.   

Nick Rockwell, Senior VP of Engineering and Infrastructure, Fastly

“We detected the disruption within one minute, then identified and isolated the cause, and disabled the configuration. Within 49 minutes, 95% of our network was operating as normal,” stated Rockwell, who was apologetic. “This outage was broad and severe, and we’re truly sorry for the impact to our customers and everyone who relies on them.” 

The damage radius was wide, causing alarm until the cause of the outage was made known and service started to be restored.  

‘Cascading Failure’ Results from Bug in a Software Update 

Complex cloud-based systems with many dependencies pose risks, especially when things go wrong. “You can end up with these cascading failures,” stated Christopher Meiklejohn, a PhD student at Carnegie Mellon’s Institute for Software Research, in an account from Vox. “They’re difficult to debug. They’re stressful and difficult to resolve. And they can be very difficult to detect early on when you’re thinking about making that change, because the systems are so complex, and they involve so many moving parts.” 

The vast systems of CDNs like Fastly, which is one of many, can involve thousands of servers deployed around the world, Meiklejohn stated, making it more likely an outage will be widespread if an error is introduced in the core software. The fact the bug was missed by Fastly’s quality control process is embarrassing for the company. “We’ll figure out why we didn’t detect the bug during our software quality assurance and testing processes,” Rockwell stated in his post. 

The Vox account likened the Fastly outage to one in 2011 when an Amazon cloud computing system, Elastic Block Store, crashed and took Reddit, Quora and Foursquare offline. After the incident, Amazon stated that one of its engineers inadvertently caused a technical problem that traveled throughout its systems and caused the outage.  

The Fastly outage was referred to as an “object lesson in internet fallibility” in an account in The Financial Times. The writer stated, “The failure is a reminder that ‘bugs’ lie buried in all new software programs. Maybe artificial intelligence will one day be able to anticipate and fix all the situations in which a piece of software can fail.” 

CDNs move content closer to users, which improves response times. CDN services include web caching, request routing and server-load balancing to reduce load times and improve website performance, according to an account from G2, which guides users in the selection of software and services.   

Companies that use CDNs include online video streaming providers and e-commerce companies whose services are adversely affected by poor performance. CDN services are often used in conjunction with website hosting services to optimize content delivery speeds.  

Customers have many options for which CDN to employ. G2 listed over 100 CDNs in its account. Fastly was in the top 10, which also included Cloudflare, CloudFront, KeyCDN, Microsoft Azure CDN and Google Cloud CDN. 

Companies with Multiple CDNs Were Able to Shift Workloads 

Some Fastly customers were able to minimize the impact of the outage by shifting workloads to alternate providers, according to an account from ThousandEyes, a network intelligence company. CDNs provide distributed local delivery, without which streaming media services would not be able to provide high-quality digital experiences, for example.   

Most CDNs today offer advanced security functionality and are able to block common malicious traffic, as well as large-scale denial-of-service attacks. Fundamentally, CDNs perform two functions: deliver content from their edge nodes to end users and fetch dynamic content from the site origin to deliver at the edge, according to the ThousandEyes account.   

Many popular high-volume sites use more than one CDN provider to deliver content to users, primarily for redundancy but also for optimizing performance. This is done for example by load balancing user requests across multiple CDNs. 
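As a simplified sketch of that redundancy idea (not any particular company’s architecture), a site can fail over between CDN hostnames when one provider errors out. Real multi-CDN setups typically do this with DNS-based load balancing rather than client code, but the logic is the same in spirit; the hostnames below are placeholders:

```python
import requests

# Placeholder hostnames serving the same content from two CDN providers.
CDN_HOSTS = ["https://cdn-a.example.com", "https://cdn-b.example.com"]

def fetch_with_failover(path: str) -> bytes:
    """Try each CDN in turn, falling back to the next on errors or timeouts."""
    last_error = None
    for host in CDN_HOSTS:
        try:
            resp = requests.get(f"{host}{path}", timeout=2)
            resp.raise_for_status()
            return resp.content  # served by the first healthy CDN
        except requests.RequestException as err:
            last_error = err     # provider down or slow; try the next one
    raise RuntimeError(f"all CDNs failed: {last_error}")
```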

Angelique Medina, Director of Product Marketing, ThousandEyes

“How a site or application owner chooses to architect its content delivery can determine the severity of impact of an outage like the one Fastly experienced,” stated the account’s author, Angelique Medina, Director of Product Marketing for ThousandEyes. “Some of Fastly’s customers had resilient delivery architectures or they were able to take action to mitigate the impact of the incident—leading to very different outcomes for their users,” she noted.  

The company examined the experience of four companies in detail. The New York Times and Reddit each used Fastly’s service as the sole CDN for their primary domains, but the two firms had different experiences. Beginning at 9:50 UTC (5:50 am ET), Reddit was down around the globe; service was restored about an hour later. 

The New York Times, in contrast, temporarily redirected users to the site’s origin servers hosted on Google Cloud Platform, reducing the downtime of its service for users. The beginning of the outage was similar to Reddit’s experience, but 40 minutes into the outage, the site’s availability “significantly increased,” well before Fastly implemented a fix. By 10:50 UTC, no Fastly servers were in the delivery path for the NYT.  

After Fastly implemented its fix, just before 10:50 UTC, the NYTimes users were redirected back to the Fastly servers. By 11:30 UTC, the site was returned to its pre-outage state.  

Amazon uses three CDNs to deliver its site, load balancing traffic across each to deliver the best possible experience to its users. Amazon has its own CDN service, CloudFront, which is part of its AWS offerings. Amazon also uses Akamai and Fastly to host its site.  

An example of one CDN vantage point showed it targeting Amazon’s site and being directed to a Fastly server just after 8:00 UTC. A few minutes later, it was directed to an Akamai server, and less than 10 minutes later it was switched over to an Amazon server. “This active allocation of users across multiple CDN services is part of normal operations for Amazon,” Medina stated. 

Amazon eventually steered users to site components hosted by its own CDN and others, such as Akamai and EdgeCast. By approximately 10:40 UTC, site loading issues had been resolved for most Amazon users. 

Read the source articles and information on the Fastly blog, from Vox, in The Financial Times, from G2 and from ThousandEyes.

Source: https://www.aitrends.com/infrastructure-for-ai/fastly-outage-holds-lessons-for-cdns-and-website-resiliency/
