This Tiny House Is 3D Printed, Floats, and Will Last Over 100 Years

One of the world’s first 3D printed houses went up in China in 2016. At 400 square meters in size and 2 stories tall, the house took 45 days to print—and at the time, this seemed amazingly fast.

Since then, similar houses have popped up in other parts of the world, including Russia, the US, Italy, and even an entire community of 3D printed homes in Mexico.

Printing about half-complete. Image credit: ©Buřinka/Kateřina Nováková

This month, another country joined the list: the Czech Republic. Not to be outshined by its predecessors, Prvok, as the house has been christened, even boasts a few extra-cool features: it has a green roof, it’s built to last over 100 years, and it can float (not on its own, though). Printed this month in a warehouse in the southwestern city of České Budějovice, the house will be transported to Střelecký Island on the Vltava River in Prague in August, where it will be open to the public.

Prvok is the brainchild of Czech artist Michal Trpak, who collaborated with bank Buřinka to make his vision a reality. “Architecture is rational, calculated in a way,” Trpak said. “Sculpture is irrational and it’s more about emotion. I like to fuse, experiment, and try new materials and technologies.”

Prvok 3d printed house robotic arm Scoolpt
The Scoolpt robotic arm printing the house. Image Credit: ©Buřinka/David Veis

The house was printed using a robotic arm called Scoolpt, which was tweaked and reprogrammed for this purpose after initially being used on an automotive assembly line. The material used was a concrete mixture enriched with nano-polypropylene fibers and substances to improve plasticity and speed up drying. “I love concrete for many reasons,” Trpak said. “It can be shaped, cast, molded, sprayed, layered… it offers so many possibilities.” It takes 24 hours for the concrete to initially “set,” or harden, but 28 days for it to set to its full load-bearing capacity, which the project’s engineers say is equivalent to that of a bridge.

It took 2 days (not straight through—22 hours of total printing time), 25 workers, and 17 tons of the custom concrete mixture to print the house, which is about 463 square feet (43 square meters). That’s the size of a studio or small one-bedroom apartment, and the space is divided into a living room/kitchen combo, a bedroom, and a bathroom. The project hasn’t released details about the cost of printing the house or its final price tag after completion.

Czech 3d printed house interior
Prvok house interior, artist rendering. Image Credit: ©Buřinka

Though it can stand on land, Prvok was specially designed to live on a pontoon. It’s fitting; with a submarine-like shape and circular porthole-like windows, the house has a distinctly nautical appearance. Trpak claims it can weather at least a hundred years in any environment. “In the future, the owners can crush the building once it has run its useful life, and print it again with the same material directly on the location,” he said.

Granted, 100 years isn’t an extraordinarily long lifespan for a house; there are plenty of them that have been around for that long, and have decades of life ahead of them. But if you consider the speed with which 3D printed houses go up and the simplicity of their construction and materials, it’s a pretty cool build-time-to-longevity ratio.

3D printing has been hailed as a fast, cheap, environment-friendly way to build affordable housing. Earlier this year, a handful of 3D printed homes were added to a community outside Austin, Texas built for people who were previously homeless, and 50 homes are being built for low-income residents of Tabasco, Mexico.

Both of those projects came from Austin-based construction technologies startup ICON, whose co-founder Jason Ballard said, “With 3D printing […] you have the possibility of a quantum leap in affordability. Conventional construction methods have many baked-in drawbacks and problems that we’ve taken for granted for so long that we forgot how to imagine any alternative.”

3D printed houses do have their own drawbacks; they’re most practical in rural areas with low population density, but the world’s biggest need for affordable, safe housing is in or near big cities. The materials they can be built from are currently limited to concrete and plastics, which aren’t practical in some climates. And the bare-bones concrete walls that are spit out by a printer can present engineering challenges or limit functionality in the home’s interior.

On the plus side, though, it seems the Czech project has just overcome one big barrier for 3D printed houses: they’re no longer exclusively confined to land.

Banner Image Credit: ©Buřinka

Source: https://singularityhub.com/2020/06/30/this-house-is-3d-printed-floats-and-will-last-over-100-years/

AI Machine Learning Efforts Encounter A Carbon Footprint Blemish

Self-driving cars leave a measurable carbon footprint, from the electricity needed to charge their batteries and to develop and maintain the machine learning models of their AI systems. (GETTY IMAGES)

By Lance Eliot, the AI Trends Insider

Green AI is arising.

Recent news about the benefits of Machine Learning (ML) and Deep Learning (DL) has taken a slightly downbeat turn toward pointing out that there is a potential ecological cost associated with these systems. In particular, AI developers and AI researchers need to be mindful of the adverse and damaging carbon footprint that they are generating while crafting ML/DL capabilities.

It is a so-called “green” or environmental wake-up call for AI that is worth hearing.

Let’s first review the nature of carbon footprints (CFPs), which are already quite familiar to all of us from sources such as the carbon-belching transportation industry.

A carbon footprint is usually expressed as the amount of carbon dioxide emissions spewed forth, including for example when you fly in a commercial plane from Los Angeles to New York, or when you drive your gasoline-powered car from Silicon Valley to Silicon Beach.

Carbon accounting is used to figure out how large a carbon footprint a machine or system produces when it is utilized, and it can be calculated for planes, cars, washing machines, refrigerators, and just about anything that emits carbon fumes.

We all seem to now know that our cars emit various greenhouse gases, including the dreaded carbon dioxide, which has numerous adverse environmental impacts. Some are quick to point out that hybrid cars, which use both gasoline and electrical power, tend to have a lower carbon footprint than conventional cars, while Electric Vehicles (EVs) produce essentially zero carbon emissions at the tailpipe.

Calculating Carbon Footprints For A Car

When ascertaining the carbon footprint of a machine or device, it is easy to fall into the mental trap of only considering the emissions that occur when the apparatus is in use. A gasoline car might emit 200 grams of carbon dioxide per kilometer traveled, a hybrid-electric about half that at 92 grams, and an EV presumably 0 grams, according to EPA and Department of Energy figures.

See this U.S. government website for detailed estimates about carbon emissions of cars: https://www.fueleconomy.gov/feg/info.shtml#guzzler

Though the direct carbon footprint does indeed involve what happens while a machine or device is being utilized, the indirect carbon footprint deserves equal attention: upstream and downstream elements contribute to a fuller picture of the true carbon footprint involved. For example, a conventional gasoline-powered car might generate perhaps 28 percent of its total lifetime carbon dioxide emissions in being manufactured and shipped to the point of sale.

You might at first be thinking like this:

  • Total CFP of a car = CFP while burning gasoline

But it should be more like this:

  • Total CFP of a car = CFP when the car is made + CFP while burning gasoline

Let’s define “CFP Made” as a factor covering the carbon footprint when a car is manufactured and shipped, and another factor, “CFP FuelUse,” representing the carbon footprint while the car is operating.

For the full lifecycle of a car, we need to add more factors into the equation.

There is a carbon footprint when the gasoline itself is being generated, I’ll call it “CFP FuelGen,” and thus we should include not just the CFP when the fuel is consumed but also when the fuel was originally processed or generated. Furthermore, once a car has seen its day and will be put aside and no longer used, there is a carbon footprint associated with disposing or scrapping of the car (“CFP Disposal”).

This also brings up a facet about EVs. Touting EVs as having zero CFP at the tailpipe is somewhat misleading when considering the total lifecycle CFP, since you should also include the carbon footprint required to generate the electrical power that is charged into the EV and then consumed while the EV is driving around. We’ll assign that amount to the CFP FuelGen factor.

The expanded formula is:

  • Total CFP of a car = CFP Made + CFP FuelUse + CFP FuelGen + CFP Disposal

Let’s rearrange the factors to group together the one-time carbon footprint amounts (CFP Made and CFP Disposal) and the ongoing usage amounts (CFP FuelUse and CFP FuelGen). This makes sense since the fuel used and fuel generated factors will vary depending upon how much a particular car is driven. Presumably, a low-mileage car that mainly sits in your garage would have a smaller lifetime grand total for those consumption amounts than a car that’s driven all the time and racking up tons of miles.

The rearranged overall formula is:

  • Total CFP of a car = (CFP Made + CFP Disposal) + (CFP FuelUse + CFP FuelGen)
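
To make the grouping concrete, here is a minimal Python sketch of that rearranged formula. All of the numbers are hypothetical, chosen only to illustrate how the one-time and usage-dependent amounts combine (the 200 grams-per-kilometer figure echoes the gasoline-car example above; the rest are assumptions):

```python
# Minimal sketch of the lifecycle carbon-footprint (CFP) formula.
# All numeric inputs are hypothetical, for illustration only.

GRAMS_PER_KG = 1000

def total_cfp_kg(cfp_made_kg, cfp_disposal_kg,
                 g_per_km_fuel_use, g_per_km_fuel_gen, lifetime_km):
    """Total CFP = (one-time amounts) + (usage-dependent amounts)."""
    one_time_kg = cfp_made_kg + cfp_disposal_kg
    ongoing_kg = (g_per_km_fuel_use + g_per_km_fuel_gen) * lifetime_km / GRAMS_PER_KG
    return one_time_kg + ongoing_kg

# Hypothetical gasoline car: 200 g/km at the tailpipe (CFP FuelUse),
# an assumed 40 g/km to refine and distribute the fuel (CFP FuelGen),
# and 200,000 km driven over its lifetime.
print(total_cfp_kg(cfp_made_kg=12_000, cfp_disposal_kg=1_000,
                   g_per_km_fuel_use=200, g_per_km_fuel_gen=40,
                   lifetime_km=200_000))  # -> 61000.0 kg
```

Notice that halving the lifetime kilometers halves only the second grouping; the one-time amounts stay fixed no matter how much the car is driven.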

Next, I’d like to add a twist that very few are considering when it comes to the emergence of self-driving autonomous cars, namely the carbon footprint associated with the AI Machine Learning for driverless cars.

Let’s call that amount “CFP ML” and add it to the equation.

  • Total CFP of a car = (CFP Made + CFP Disposal) + (CFP FuelUse + CFP FuelGen) + CFP ML

You might be puzzled as to what this new factor consists of and why it is being included. Allow me to elaborate.

AI Machine Learning As A Carbon Footprint

In a recent study done at the University of Massachusetts, researchers examined several AI Machine Learning or Deep Learning systems that are being used for Natural Language Processing (NLP) and tried to estimate how much of a carbon footprint was expended in developing those NLP systems (see the study at this link here: https://arxiv.org/pdf/1906.02243.pdf).

You likely already know something about NLP if you’ve ever had a dialogue with Alexa or Siri. Those popular voice-interactive systems are trained via a large-scale or deep Artificial Neural Network (ANN), a kind of computer-based model that simplistically mimics brain-like neurons and their connections; ANNs are a vital area of AI for building systems that can “learn” from datasets provided to them.

Those of you versed in computers might be perplexed that the development of an AI Machine Learning system would somehow produce CFP since it is merely software running on computer hardware, and it is not a plane or a car.

Well, electrical energy is used to power the computer hardware that runs the software producing the ML model, so you could assert that the crafting of an AI Machine Learning system has caused some amount of CFP via however the electricity powering the ML training operation was generated.

According to the calculations done by the researchers, a somewhat minor or modest NLP ML model was responsible for an estimated 78,468 pounds of carbon dioxide emissions during its training, while a larger NLP ML model accounted for an estimated 626,155 pounds during training. As a basis for comparison, they report that an average car accounts for about 126,000 pounds of carbon dioxide emissions over its lifetime.

A key means of calculating the carbon dioxide produced was the EPA’s formula: the total electrical power consumed, in kilowatt-hours, is multiplied by a factor of 0.954 to arrive at the average CFP in pounds, based on assumptions about power generation plants in the United States.
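
As a rough sketch of that conversion (assuming the 0.954 pounds-per-kilowatt-hour factor cited above, which is a U.S. average and will vary with the actual generation mix):

```python
# CO2 estimate from electricity use, per the EPA-based factor cited above.
# 0.954 lbs of CO2 per kWh is an average U.S. assumption; actual values
# vary by region, year, and power-generation mix.
LBS_CO2_PER_KWH = 0.954

def training_cfp_lbs(kwh_consumed):
    """Estimated pounds of CO2 attributable to an ML training run."""
    return kwh_consumed * LBS_CO2_PER_KWH

# Working backward from the study's larger NLP model (~626,155 lbs),
# the implied electricity use is on the order of:
print(626_155 / LBS_CO2_PER_KWH)  # ~656,347 kWh
```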

Significance Of The CFP For Machine Learning

Why should you care about the CFP of the AI Machine Learning for an autonomous car?

Presumably, conventional cars don’t have to include the CFP ML factor since a conventional car does not encompass such a capability; the factor would therefore have a value of zero in the case of a conventional car. Meanwhile, for a driverless car, the CFP ML would have some determinable value and would need to be added into the total CFP calculation.

Essentially, it burdens the carbon footprint of a driverless car and tends to heighten the CFP in comparison to a conventional car.

Before you react instantly to this aspect (I don’t think it means the sky is falling, nor that we should somehow put the brakes on developing autonomous cars), you ought to consider these salient topics:

  • If the AI ML is deployed across a fleet of driverless cars (perhaps hundreds, thousands, or eventually millions of autonomous cars), and the same AI ML instance runs in each of them, the CFP of producing the AI ML is divided across all of those cars and is therefore likely a relatively small fractional addition of CFP on a per-car basis (see the sketch after this list).
  • Autonomous cars are more than likely to be EVs, partially due to the handy aspect that an EV is adept at storing electrical power, of which the driverless car sensors and computer processors slurp up and need profusely. Thus, the platform for the autonomous car is already going to be significantly cutting down on CFP due to using an EV.
  • Ongoing algorithmic improvements are bound to make it more efficient to create AI ML models, either decreasing the time required to produce them (accordingly likely reducing the electrical power consumed) or making better use of electrical power through faster processing by the hardware and software.
  • For semi-autonomous cars, you can expect that we’ll see AI ML being used there too, in addition to fully autonomous cars, and so the CFP of the AI ML will eventually apply to all cars as conventional cars are gradually supplanted by semi-autonomous and fully autonomous ones.
  • Some might argue that the CFP of the AI ML ought to be tossed into the CFP Made bucket, meaning that it is just another CFP component within the effort to manufacture the autonomous car. And, if so, based on preliminary analyses, it would seem like the CFP AI ML is rather inconsequential in comparison to the rest of the CFP for making and shipping a car.
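
As a back-of-the-envelope sketch of that first point about fleet amortization (fleet sizes here are made up for illustration, and the 626,155-pound figure is simply borrowed from the NLP study as a stand-in for a sizable training effort):

```python
# Amortizing a one-time ML-training CFP across a fleet of driverless cars.
# The training figure is borrowed from the NLP study as a stand-in for a
# large training effort; fleet sizes are hypothetical.
TRAINING_CFP_LBS = 626_155

for fleet_size in (100, 10_000, 1_000_000):
    per_car_lbs = TRAINING_CFP_LBS / fleet_size
    print(f"{fleet_size:>9,} cars -> {per_car_lbs:,.2f} lbs of CFP ML per car")
```

Even at a fleet of just ten thousand cars, the per-car share drops to roughly 63 pounds, a sliver of a car’s lifetime CFP.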

For those of you interested in trying out an experimental impact tracker in your AI ML developments, there are various tools coming available, including for example this one posted at GitHub that was developed jointly by Stanford University, Facebook AI Research, and McGill University: https://github.com/Breakend/experiment-impact-tracker.
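
By way of example, here is a minimal usage sketch based on the quick-start shown in that repository’s documentation (the package name, import path, and methods are as published there at the time of writing and may have changed since, so treat this as illustrative):

```python
# Minimal sketch of the experiment-impact-tracker quick-start, per the
# README at https://github.com/Breakend/experiment-impact-tracker
# (API details may have changed; consult the repository).
from experiment_impact_tracker.compute_tracker import ImpactTracker

tracker = ImpactTracker("my_experiment_logs")  # directory for energy/CFP logs
tracker.launch_impact_monitor()  # background monitor that samples power draw

# ... run the ML training loop here; the monitor logs consumption data
# that can later be compiled into an emissions estimate.
```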

As they say, your mileage may vary in terms of using any of these emerging tracking tools and you should proceed mindfully and with appropriate due diligence for applicability and soundness.

For my framework about AI autonomous cars, see the link here: https://aitrends.com/ai-insider/framework-ai-self-driving-driverless-cars-big-picture/

Why this is a moonshot effort, see my explanation here: https://aitrends.com/ai-insider/self-driving-car-mother-ai-projects-moonshot/

For more about the levels as a type of Richter scale, see my discussion here: https://aitrends.com/ai-insider/richter-scale-levels-self-driving-cars/

For the argument about bifurcating the levels, see my explanation here: https://aitrends.com/ai-insider/reframing-ai-levels-for-self-driving-cars-bifurcation-of-autonomy/

Conclusion

There’s an additional consideration for the CFP of AI ML.

You could claim that there is a CFP AI ML for originating the Machine Learning model that will be driving the autonomous car, and then there is the ongoing updating and upgrading involved too.

Therefore, the CFP AI ML is more than just a one-time CFP; it is also part of the ongoing grouping.

Let’s split it across the two groupings:

  • Total CFP of a car = (CFP Made + CFP Disposal + CFP ML1) + (CFP FuelUse + CFP FuelGen + CFP ML2)

You can go even deeper and point out that some of the AI ML will be taking place in-the-cloud of the automaker or tech firm and then be pushed down into the driverless car (via Over-The-Air or OTA electronic communications), while some of the AI ML might be also occurring in the on-board systems of the autonomous car. In that case, there’s the CFP to be calculated for the cloud-based AI ML and then a different calculation to determine the CFP of the onboard AI ML.

Some point out that you can burden a lot of things in our society if you are going to consider the amount of electrical power they use, and that it is perhaps unfair to suddenly bring up the CFP of AI ML in isolation from the myriad other ways in which CFP arises from any kind of computer-based system.

In the case of autonomous cars, it is also pertinent to consider not just the “costs” side of things, which includes the carbon footprint factor, but also the benefits side of things.

Even if there is some attributable amount of CFP for driverless cars, it would be prudent to consider what kinds of benefits we’ll derive as a society and weigh those against the CFP aspects. Taking into account the hoped-for benefits, including the potential of human lives saved, the potential for mobility access for all, including the mobility marginalized, and other societal transformations, you get a much more robust picture.

In that sense, we need to figure out this equation:

  • Societal ROI of autonomous cars = Societal benefits – Societal costs

We don’t yet know how it is going to pan out, but most are hoping that the societal benefits will readily outweigh the societal costs, and that the ROI for self-driving driverless autonomous cars will therefore be hefty and leave us all nearly breathless.

Copyright 2020 Dr. Lance Eliot

This content is originally posted on AI Trends.

[Ed. Note: For readers interested in Dr. Eliot’s ongoing business analyses about the advent of self-driving cars, see his online Forbes column: https://forbes.com/sites/lanceeliot/]

Source: https://www.aitrends.com/ai-insider/ai-machine-learning-efforts-encounter-a-carbon-footprint-blemish/

AI Helping to Transform Education in Pandemic Era

AI is supporting innovation in education, such as with software to help struggling readers by providing micro-feedback. (GETTY IMAGES)

By AI Trends Staff

The impact of the COVID-19 pandemic on education has been profound, with new ways of thinking about how best to teach students reverberating in institutions of higher learning, K-12 classrooms and in the business community.

The role of AI is central to the discussion on every level. For the K-12 classroom, teachers are thinking about how to use AI as a teaching tool. For example, Deb Norton of the Oshkosh Area school district in Wisconsin was asked several years ago by the International Society for Technology in Education to lead a course on the uses of AI in K-12 classrooms, according to a recent account in Education Week.

The course includes sections on the definition of artificial intelligence, machine learning, voice recognition, chatbots, and the role of data in AI systems. To teach about machine learning, one teacher tied it to yoga: a student could do a yoga pose that would be recognized via machine learning, and the machine could then give feedback on the pose.

Another teacher working with elementary students used the coding site Scratch to create interactive characters and programs, such as skills for Amazon’s Alexa, which are like apps on a smartphone except activated by voice.

Deb Norton, teacher, Oshkosh Area school district, Wisconsin

Asked if she foresees increasing interest in AI as a result of increased remote learning during the pandemic, Norton stated, “AI could become a really big part of virtual learning and at-home learning, but I just don’t think we’re quite there yet. For many of our educators, they’re just dipping their feet into how this would work.”

Protecting privacy is an issue. Many schools will not allow Alexa and Google Home devices to be opened up out of concern for personal privacy. One workaround could be a school-only network to serve as a test bed.

She does see the potential for AI to help with learning management applications “from a teacher-educator point of view, to be able to engage and monitor and track the types of lessons and strategies that can be delivered in the most effective way in the classroom.”

Investor Sees Disruption Ahead in Higher Education

From an investment point of view, AI in education in the new era represents opportunity. Some see disruption looming in the higher education university system as a result.

“A reckoning is coming for schools and universities,” stated Scott Galloway, a professor of marketing at the NYU Stern School of Business, in a recent account in TechCrunch. “We’ve raised prices 1400% but if you walked into a classroom today it wouldn’t look, smell or feel much different from what it did 40 years ago.”

Likening it to a shrinkage in retail – which saw 9,500 closures in 2019 and more than 15,000 so far in 2020 – he predicts a sustained drop in applications for four-year universities, with dozens if not hundreds of colleges and universities unable to recover.

Scott Galloway, professor of marketing, NYU Stern School of Business

Roei Deutsch, co-founder and CEO of live video course marketplace Jolt Inc., stated during a talk on the Coffee Break podcast, “The blow to the world of higher education was bound to come. There is a higher education bubble, something there does not work in terms of cost versus what students receive in return, and you can say that the coronavirus crisis is the beginning of this bubble’s bursting.”

Thus the virus is seen as accelerating a trend that was already underway. The global corporate e-learning market is estimated to grow to $30 billion at a 13% compound annual growth rate through 2022. “This growth was driven in large part by the increased importance of matching workforce capabilities with actual required skill sets,” stated Joe Apprendi, author of the TechCrunch article and a general partner at Revel, a venture capital firm formed by business founders.

New core education products, as suggested by teacher Norton, include learning experience platforms (LXPs) and learning management systems (LMSs), used to monitor, track, and administer employee learning activities.

Learning software is primarily designed to create more personalized learning experiences and help users discover new learning opportunities by combining learning content from different sources, while recommending and delivering them — with the support of AI — across multiple digital touch points such as desktop applications and mobile learning apps.

Colleges, universities, and enterprises are all looking at these tools. Instead of building training academies to help train people for new or expanded roles in an organization, “Enterprises will now target the front end of the recruiting funnel where higher education begins,” Apprendi suggests. “The potential for global enterprises to own the university experience is suddenly, very real.” The online faculty could be professors from shuttered universities. A hybrid, for-profit model that blends universities and global enterprises could emerge, along the lines of the US Naval Academy, where a tuition-free education comes with an obligation to serve for a period of time.

“Students could see debt cut in half and have a clear path forward toward employment,” he stated. Whatever landscape emerges, changes are in store for universities and colleges.

Software Helping with Remote Learning Challenges

Meanwhile back in K-12 education, the transition to remote learning has been challenging. Many students fail to log into classrooms or complete assignments, according to a recent account in TechRepublic. The number of students logging in has declined by 43% since the start of school closures, and the number of students completing at least one virtual lesson has dropped by 44%, according to a report from Achieve3000. The report was based on data from 1.6 million students across 1,364 school districts.

The transition to e-learning is particularly difficult for struggling readers, who need more time and individual assistance with lessons, the report found. An innovative approach is taken by AI-powered software Amira, designed to remotely help students become better readers.

Amira has been recognized with honors such as a nomination for a Codie Award for Best Use of Emerging Technology for Learning in Education. It is an intelligent reading assistant built on decades of research into the science of reading at the University of Texas and on AI in support of reading development at Carnegie Mellon University.

“Amira listens, delivers in-the-moment error-specific feedback, and reports progress for every reading session,” stated Sara Erickson, Amira’s vice president of customer success. “Amira is changing how teachers focus their reading instruction with the help of machine learning to accelerate student reading growth.”

As the student reads, Amira uses AI to decipher what obstacles the young reader is facing, delivering micro-interventions that help bridge the reading skills gaps. The software helps assess reading fluency, pinpoint errors, and improve those weaknesses.

“Teachers will never be replaced by software, but they can be supported by it,” Erickson stated. Approximately 125 school districts are currently using Amira, with the K-3 student population in those districts totaling more than 600,000.

Read the source articles in Education Week, TechCrunch and TechRepublic.

Source: https://www.aitrends.com/education/ai-helping-to-transform-education-in-pandemic-era/

AI-Based Tools Predict COVID-19 Disease Severity

Researchers are probing the use of AI and imaging to determine which patients testing positive for COVID-19 are most likely to need extensive treatment. (GETTY IMAGES)

By Paul Nicolaus, Science Writer

Two healthcare workers under the age of 30 fell ill in Wuhan, China, where the first COVID-19 case was reported. One survived. The other wasn’t as fortunate. But why?

It’s an example researchers at the Radiological Society of North America highlighted while pointing out that this phenomenon—some patients falling critically ill and dying as others experience minimal symptoms or none at all—is one of the most mysterious elements of this disease. Mortality does correlate with factors such as age, gender, and some chronic conditions. Considering young and previously healthy individuals have succumbed to this virus, though, there could be more complex prognostic factors involved.

Current diagnostic tests determine whether or not individuals have the virus. They do not, however, offer clues as to just how sick a COVID-positive patient could become. For the time being, clinicians cannot easily predict which patients who test positive will require hospital admission for oxygen and possible ventilation.

Because most cases are mild, identifying those at risk for severe and critical cases early on could help healthcare facilities prioritize care and resources such as ventilators and ICU beds. Figuring out who is at low risk for complications could be useful, too, as this could reduce hospital admissions while these patients are managed at home. As health systems across the globe continue to deal with large numbers of COVID-19 cases, new and emerging technologies may be able to help in this regard.

AI Plus Imaging

Researchers have been probing the use of AI and imaging to determine who has COVID-19, but some groups are taking a different approach, using this same combination to determine which patients are most likely to need the most extensive treatment.

In a paper published July 22 in Radiology: Artificial Intelligence (doi: 10.1148/ryai.2020200079), researchers at Massachusetts General Hospital and Harvard Medical School reveal efforts to develop an automated measure of COVID-19 pulmonary disease severity using chest radiographs (CXRs) and a deep-learning algorithm.

Elsewhere, an international group proposed an AI model that uses COVID-19 patients’ geographical, travel, health, and demographic data to predict disease severity and outcome. Future work is expected to focus on the development of a pipeline that combines CXR scanning models with these types of healthcare data and demographic processing models, according to their paper published July 3 in Frontiers in Public Health (doi: 10.3389/fpubh.2020.00357).

In June, GE Healthcare announced a partnership with the University of Oxford-led National Consortium of Intelligent Medical Imaging (NCIMI) in the UK to develop algorithms aimed at predicting COVID-19 severity, complications, and long-term impact.

Similarly, experts at the University of Copenhagen set out to create models that calculate the risk of a COVID-19 patient’s need for intensive care. The algorithms are designed to comb through data on Danish coronavirus patients who have been through the system, finding shared traits among the most severely affected. Those patterns are compared with data gathered from recently hospitalized patients, such as X-rays, and sent to a supercomputer to predict how likely a patient is to require a ventilator and how many days will pass before that need arises.

Meanwhile, researchers at Case Western Reserve University are using computers to find details in digital images of chest scans that are not easily seen by the human eye to quickly determine which patients are most likely to experience further deterioration of their health and require the use of ventilators.

“The approach we’ve taken is actually to create a synergistic artificial intelligence algorithm—one that combines patterns from CT scans with clinical parameters based on lab values,” Anant Madabhushi, professor of biomedical engineering at Case Western Reserve and head of the Center for Computational Imaging and Personalized Diagnostics (CCIPD) told Diagnostics World.

Anant Madabhushi, professor of biomedical engineering, Case Western Reserve University

“And the secret sauce, if you will, is the fact that we’re using neural networks and deep learning to automatically go into the CT scans and identify exactly where the region of disease is,” he added. Zeroing in on the disease presentation on the CT scan makes it possible to mine patterns using the neural networks from those regions and combine them with the clinical parameters.

Madabhushi and colleagues have completed a multi-site study that included nearly 900 patients from Wuhan, China, and Cleveland, Ohio. They found that the combination of the clinical parameters and imaging features yielded a higher predictive accuracy in identifying who would go on to need a ventilator compared to a model that uses the imaging features alone and also compared to a model that used only the clinical parameters.

The inspiration for this work came about months ago as Italy hit its peak and the country’s hospitals were overwhelmed with patients who couldn’t breathe. Some of the stories were gut-wrenching, he explained, particularly the ones that highlighted how physicians had to make case by case determinations about who got a ventilator and who didn’t.

“It really got me thinking about what the implications are for the US or the rest of the world,” he said, if a second wave materializes in the fall as some experts have predicted. Of course, we are not out of the first wave yet, he acknowledged, but there is a real concern that a second wave could be even deadlier than the first considering it would take place during flu season.

Madabhushi and colleagues began building their model using images and datasets found online in early March. In April, the CCIPD was offered digital images of chest scans taken from roughly 100 early victims of the novel coronavirus from Wuhan, China. Using that information, the researchers developed machine learning models to predict the risk of a COVID-19 patient needing a ventilator—one based on neural networks and another derived from radiomics.

Early CT scans from patients with COVID-19 showed distinctive patterns specific to those in the intensive care unit (ICU) compared to those not in the ICU. Initially, the research team was able to achieve an accuracy of roughly 70% to 75%. Since then, they have improved upon that performance metric, he said, raising the accuracy level to about 84%.

They have worked to circumvent bias by exposing the AI to patients from different demographics, ethnicities, populations, and scanners. But there’s still work to be done, including additional multi-site testing and prospective field testing. Madabhushi hopes to validate the technology on patients from the Louis Stokes Cleveland VA Medical Center, where he is a research scientist, and is looking to prove the technology at Cleveland Clinic as well.

The team is also developing a user interface that couples the AI with a tool that allows the end-user to enter a CT scan and clinical parameters to see the likelihood of needing a ventilator. Before clinically deploying the technology, he wants to put this in the hands of end-users for additional prospective field testing so that users can get comfortable with the tool, get a sense of how to work with it, and learn how to interpret and use the results coming out of it.

Rather than making arbitrary decisions about who gets a ventilator and who does not, the big hope is that this type of triaging technology could enable more rational decision-making for appropriating resources.

AI and Blood Biomarkers

Another group was also motivated by the scenario that played out in northern Italy back in February and March as a lack of ICU beds led to tough decisions for clinicians.

“Unfortunately, this process, I would say, is a little bit cyclical,” John T. McDevitt, professor of biomaterials at NYU College of Dentistry and professor of chemical and molecular engineering at NYU Tandon School of Engineering told Diagnostics World. Similar scenarios have played out in New York City, for instance, and more recently in Houston. “When you hit this point where you don’t have any buffer, any excess capacity, then it forces a very difficult situation.”

John T. McDevitt, professor of chemical and molecular engineering at NYU Tandon School of Engineering

He wants to provide clinicians with what he describes as “a flashlight that goes into this dark room of COVID-19 severity.” The intent is to look into the future and attempt to figure out which patients will perish unless extreme measures are taken, which patients should be admitted to the hospital, and which patients can safely recover from home.

“I would describe this as the third leg of the stool for the diagnosis and prognosis of COVID-19,” he explained. PCR testing has been used to determine whether individuals have the disease, and serology testing has helped establish whether people have had the condition in the past. The missing leg here, he said, has been determining which patients are going to end up in the hospital and which patients are most likely to perish.

To fill that void, he and colleagues have developed a smartphone app that uses AI and biomarkers in patients’ blood to determine COVID-19 disease severity. Their findings were published June 3 in Lab on a Chip (doi: 10.1039/D0LC00373E).

Relying on data from 160 hospitalized COVID-19 patients in Wuhan, China, they found four biomarkers measured in blood tests that were elevated in the patients who died compared with those who recovered. These biomarkers (C-reactive protein, myoglobin, procalcitonin, and cardiac troponin I) can signal complications relevant to COVID-19, such as reduced cardiovascular health, acute inflammation, or lower respiratory tract infection.

The researchers then developed a model using the biomarkers as well as age and sex—two risk factors. They trained the model to define the patterns of COVID-19 disease and predict its severity. When a patient’s information is entered, the model comes up with a numerical severity score ranging from 0 (mild) to 100 (critical), reflecting the probability of death from the complications of COVID-19.
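
The paper’s exact model is not reproduced here, but the described mapping, six inputs to a 0 (mild) through 100 (critical) score reflecting the probability of death, could be sketched along the following lines. This is a hypothetical logistic-regression stand-in, not the authors’ published model, and the data below is synthetic placeholder structure rather than real patient records:

```python
# Hypothetical sketch of a severity-score model of the kind described:
# four blood biomarkers plus age and sex, mapped to a 0-100 score via an
# estimated probability of death. NOT the authors' published model; the
# training data here is synthetic, used only to make the sketch runnable.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
# Columns: [CRP, myoglobin, procalcitonin, cardiac troponin I, age, sex]
X = rng.normal(size=(160, 6))     # synthetic stand-in for patient features
y = rng.integers(0, 2, size=160)  # synthetic outcome: 1 = died, 0 = recovered

model = LogisticRegression().fit(X, y)

def severity_score(patient_features):
    """Map a patient's six features to a 0 (mild) - 100 (critical) score."""
    p_death = model.predict_proba(np.asarray(patient_features).reshape(1, -1))[0, 1]
    return 100.0 * p_death

print(round(severity_score(rng.normal(size=6)), 1))
```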

The model was validated using information from 12 hospitalized COVID-19 patients from Shenzhen, China, and further validated using data from over 1,000 New York City patients. The app has also been evaluated in the Family Health Centers at NYU Langone in Brooklyn.

The diagnostic system uses small samples, such as swabs of saliva or drops of blood from a fingertip, which are added to credit card-sized cartridges. The cartridge is put into a portable analyzer that tests for a range of biomarkers, with results available in under 30 minutes. After optimizing the app’s clinical utility, the goal is to roll it out nationwide and worldwide.

Over the coming months, McDevitt’s laboratory, in partnership with SensoDx—a company spun out of his lab—intends to develop and scale the ability to produce a severity score similar to the way people with diabetes check their blood sugar. The plan is to distribute the tool first to disease epicenters to maximize its impact considering not all locations are dealing with a shortage of ICU beds or respirators.

McDevitt also highlighted the potential to help address racial disparities. “COVID has ripped the scab off of this particular wound,” he said. This technology can help level the healthcare playing field and remove some of the unintentional racial or ethnic biases that may weave their way into the delivery of healthcare. By putting the severity score on a numerical index, it arguably provides a more objective way to make challenging pandemic-related healthcare decisions.

McDevitt and colleagues aren’t the only ones pursuing blood-based biomarkers for the prediction of COVID-19 disease severity.

Another example can be found in a study published May 14 in Nature Machine Intelligence (doi: 10.1038/s42256-020-0180-7) and conducted by a group of Chinese researchers, who used a database of blood samples from nearly 500 infected patients in the Wuhan region.

Their machine learning-based model predicts the mortality rates of patients over 10 days in advance with more than 90% accuracy, according to the paper, using three biomarkers: lactic dehydrogenase, lymphocyte, and high-sensitivity C-reactive protein.

Paul Nicolaus is a freelance writer specializing in science, nature, and health. Learn more at www.nicolauswriting.com. This article was originally published in Diagnostics World.

Source: https://www.aitrends.com/ai-research/ai-based-tools-predict-covid-19-disease-severity/
