

Resistance Is Patriotism on the Fourth of July – @TheAtlantic by Ibram X. Kendi @DrIbram #FourthofJuly





“…We should be celebrating our disobedience, turbulence, insolence, and discontent about inequities and injustices in all forms. We should be celebrating our form of patriotism that they call unpatriotic, our historic struggle to extend power and freedom to every single American. This is our American project.

Because power comes before freedom, not the other way around. Power creates freedom, not the other way around. We can’t be free unless we have power. Freedom is not the power to make choices. Freedom is the power to create choices. And to have the power to shape policy is the power to create choices. That is why power is in the hands of the policy maker.”

Read more.



AI Helping to Manage the Environment, From Sea Ice to Water for Malta




AI is being applied to management of the environment, in projects including monitoring sea ice to better assist vessels in the polar seas. (GETTY IMAGES)

By AI Trends Staff

Researchers at the Norwegian Meteorological Institute are studying how to apply AI to predictions of the spread of sea ice, to make warnings to vessels in the polar seas cheaper, faster, and more widely available.

The effort is indicative of a trend to apply AI to the management of the environment, with the aim of gaining accuracy and potentially lowering costs.

Efforts to predict sea ice today are resource intensive. “As of now, large resources are needed to create these ice warnings, and most of them are made by The Norwegian Meteorological Institute and similar centers,” stated Sindre Markus Fritzner, a doctoral research fellow at UiT The Arctic University of Norway, in a recent account.


The ice warnings used today are based on dynamic computer models fed with satellite observations of the ice cover, and whatever updated data can be gathered about ice thickness and snow depth. This generates an enormous volume of data, which then needs to be processed by powerful supercomputers.

Fritzner is working on training a machine learning model by loading in data for one week, then data for how it will look one week later. In this way, the machine learns and eventually will be able to make predictions. Once developed, the algorithm will consume less computing power than the processing of today’s physical models.

“If you use artificial intelligence and have a fully trained model, you can run such a calculation on a regular laptop,” Fritzner stated.
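Fritzner's training setup, one week of data in and the following week as the target, can be illustrated with a minimal sketch. Everything below (the synthetic grids, their size, and the plain linear least-squares model) is an assumption for illustration; the real system trains on satellite-derived fields of ice cover, thickness, and snow depth.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-in for weekly sea-ice concentration maps (values in [0, 1]).
n_weeks, h, w = 50, 8, 8
ice = rng.random((n_weeks, h, w))
# Impose a simple known dynamic: next week = slight smoothing toward the mean.
for t in range(1, n_weeks):
    ice[t] = np.clip(0.9 * ice[t - 1] + 0.1 * ice[t - 1].mean(), 0.0, 1.0)

# Training pairs: the grid at week t as input, the grid at week t+1 as target.
X = ice[:-1].reshape(n_weeks - 1, -1)   # (49, 64)
Y = ice[1:].reshape(n_weeks - 1, -1)    # (49, 64)

# Fit a linear map from this week's field to next week's (least squares).
# A production model would be a neural network, but the idea is the same:
# learn the one-week transition once, then predict cheaply ever after.
W_fit, *_ = np.linalg.lstsq(X, Y, rcond=None)

# Forecast next week from the most recent observed field.
forecast = (ice[-1].reshape(1, -1) @ W_fit).reshape(h, w)
print(forecast.shape)
```

The point of the sketch: once the transition is learned, producing a forecast is a single matrix multiply, cheap enough for a laptop, whereas the physics-based models still need a supercomputer.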

In the current system, the Norwegian Meteorological Institute must run the predictive models on its supercomputers and transmit the results back to the vessel. “If equipped with the right program and artificial intelligence, this can be done from the vessel itself, with nearly no computing power required at all,” Fritzner says.

While the outlook is promising, more development work remains. “As long as the changes in the ice were small, the machine learning functioned quite well. When the changes were greater, with a lot of melting, the models struggled more than the physical models,” Fritzner stated. The physical models are constantly adapted to large geophysical changes like increased melting and rapid changes to the weather.

Smart Sensors to Span Continents

A network of smart sensors spanning continents is being constructed by the Sage project, with the help of a $9 million grant from the National Science Foundation awarded in September 2019 to a team led by Northwestern University.

The Sage project calls for advanced machine learning algorithms to be moved to the edge of an Internet of Things (IoT) network, to bring the data analysis very near to the site where data is gathered, according to an account on the website of Sage. The organization is directed by Pete Beckman of Northwestern University, and includes participation by 10 to 15 other academics. The site describes the organization as wanting to build a software-defined sensor network: cyberinfrastructure for AI at the edge.

Pete Beckman, Co-Director, Northwestern Argonne Institute of Science and Engineering

Linking small, powerful computers directly to high-resolution cameras, air quality and weather sensors, and experimental Light Detection and Ranging (Lidar) systems, this new distributed infrastructure will enable researchers to analyze and respond to data almost instantly. From early detection of wildfire smoke plumes in California to identifying ultrasonic calls of bats or the patterns of pedestrians in a busy crosswalk, Sage’s AI-enabled sensors will give scientists a new tool to understand our planet, the site states.

The distributed, intelligent sensor network will work to understand data on the impacts of global urbanization, natural disasters such as flooding and wildfires, and climate change on natural ecosystems and city infrastructure. Sage will embed computers directly into the sensor network and rely on advancements in edge computing to analyze the torrent of sensor data as it streams past.
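The pattern Sage describes, analyzing the stream locally and shipping only compact results upstream, can be sketched with a toy edge node. The class, threshold, and readings below are invented for illustration and are not Sage's actual software.

```python
from dataclasses import dataclass
from statistics import mean

@dataclass
class EdgeNode:
    threshold: float          # e.g. a sensor level suggesting wildfire smoke
    window: int = 5           # readings summarized per upstream message

    def process(self, stream):
        """Yield one small summary per window instead of every raw reading."""
        buf = []
        for reading in stream:
            buf.append(reading)
            if len(buf) == self.window:
                avg = mean(buf)
                # Only this compact dict travels to the cloud, not the raw data.
                yield {"mean": round(avg, 2), "alert": avg > self.threshold}
                buf = []

node = EdgeNode(threshold=35.0)
readings = [10, 12, 11, 9, 14, 80, 95, 102, 88, 90]  # second window spikes
print(list(node.process(readings)))
# → [{'mean': 11.2, 'alert': False}, {'mean': 91.0, 'alert': True}]
```

Ten raw readings become two small messages, which is the bandwidth argument for pushing analysis to the edge of the sensor network.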

Project partner instruments include the NSF-funded Array of Things, the NSF’s National Ecological Observatory Network (NEON), Atmospheric Radiation Measurement (ARM), and the High-Performance Wireless Research and Education Network (WIFIRE). Sage will work to integrate measurements from these multiple instruments in the “software-defined” sensor network.

Sage test nodes will be deployed in California, Colorado, and Kansas and in urban settings in Illinois and Texas. The project will build on the open source technology platform used in the Array of Things project, which has deployed more than 100 sensors with edge computing capabilities within Chicago.

University of Arizona Researchers to Help Malta with Water Planning

Researchers at the University of Arizona Center for Innovation have announced that a subsidiary company using technology spun out of the university has been awarded a contract by the European Union to study the water needs of the island of Malta, according to a press release from the university.

Malta, the most densely populated European nation, is an archipelago consisting of one main island and several smaller ones located in the Mediterranean Sea between Sicily and Africa. It ranks among the most water-scarce nations in the world. Today, with a booming economy and thriving as a popular tourist destination, Malta is experiencing a “perfect storm” that is becoming all too typical across the globe: too little natural freshwater for its growing demand, made worse by more frequent droughts from climate change.

NOAH Arizona LLC will study the feasibility of implementing its patented water management decision support system on the island, working in partnership with Malta’s Energy and Water Agency and Water Services Corporation, the water utility in Malta.

NOAH’s patented water management system combines real-time data streams with AI and optimization models. The system helps identify optimal water management strategies that minimize costs while maximizing water sustainability and quality. NOAH worked with the startup incubation team at the University of Arizona Center for Innovation to identify markets to prove how AI can be applied to water management challenges that nations like Malta are facing.

“This is a wonderful opportunity for NOAH to further advance our system by collaborating with water experts in Malta. They are on the front-line of some of the world’s most serious and difficult water problems,” stated Emery Coppola, NOAH LLC co-founder and president.

Currently, Malta obtains approximately 60 percent of its water supply from three plants using a reverse osmosis (RO) process for water purification. The remaining demand is met by extracting diminishing groundwater through 100 production wells.

A major challenge will be identifying the optimal trade-off between RO and groundwater sources among multiple conflicting objectives: the water quality from RO is superior to groundwater, but the cost of producing freshwater by RO is higher.
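That trade-off can be sketched as a tiny optimization: pick the RO share of supply that minimizes cost while meeting a quality floor. All figures below are invented placeholders, not Malta's actual costs, and NOAH's patented system is far richer than this simple grid search.

```python
import numpy as np

demand = 100.0                       # units of water needed per day (made up)
cost_ro, cost_gw = 1.0, 0.4          # RO is more expensive per unit
quality_ro, quality_gw = 0.95, 0.6   # RO delivers higher quality
min_quality = 0.8                    # required demand-weighted quality floor

best = None
for f in np.linspace(0.0, 1.0, 101):          # f = fraction supplied by RO
    quality = f * quality_ro + (1 - f) * quality_gw
    if quality < min_quality:
        continue                               # infeasible blend, skip it
    cost = demand * (f * cost_ro + (1 - f) * cost_gw)
    if best is None or cost < best[1]:
        best = (f, cost)                       # cheapest feasible blend so far

f_opt, cost_opt = best
print(f"RO share: {f_opt:.2f}, daily cost: {cost_opt:.1f}")
```

With these placeholder numbers the cheapest feasible blend is the smallest RO share that still clears the quality floor, which is exactly the shape of the conflict the article describes: quality pushes toward RO, cost pushes toward groundwater.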

Read the source articles and information at Sage and in the University of Arizona press release.




Standardization: The Master Key to Unlocking the Full Potential of IoT




Illustration: © IoT For All

Advocates of the Internet of Things (IoT) are no strangers to its potential, knowing it has the ability to aid every facet of the human experience. With market reports estimating 41.6 billion connected devices by 2025 – and industry verticals across the board embracing digital innovation – now more than ever, it is crucial to establish IoT industry standards.

While IoT’s potential is promising, it remains hindered by the inability to fully embrace universal standards to properly manage the security, interoperability, and scalability issues that arise in day-to-day IoT deployments. These issues result in an IoT that is not simple, efficient, or trustworthy. With these barriers, the exponential benefits of IoT are wasted.

Interoperability & Security Barriers

Currently, IoT devices face mounting security threats, with preventable security flaws at the hardware, software, and network levels. These flaws are exploited by hackers and other bad actors, leaving end users distrustful of connected devices. Driven by market demands, manufacturers focus on releasing new products quickly rather than on ensuring their products are developed with the necessary level of security. This approach leads to the endless reports of security breaches that saturate the news today and creates a negative association when consumers hear “IoT” and “smart home.”

Competition among manufacturers aiming to dominate the IoT market has produced a severe lack of interoperability among most IoT devices and a fragmented ecosystem. A lack of device integration and differing application frameworks result in an overall negative user experience, ultimately defeating the purpose of the IoT: to create ease of use and simplify daily tasks. IoT standard adoption allows manufacturers to quickly create and implement reliable, secure device discovery and connectivity across operating systems, platforms, transports, and vendors. This bypasses the need to develop for every physical transport or operating system, maximizing interoperability and market scalability.

Given that interoperability and security barriers can undermine the massive potential of the IoT, why would the industry not do everything in its power to make the IoT fully interoperable and secure? The good news is that industry standards exist today that make this possible, and their benefits can positively impact both the smart home and connected buildings verticals.

The Smart Home Evolution Requires IoT Standardization

For a home to be “smart,” devices must be able to act intelligently based on the information they receive from another device. A connected home allows a homeowner to change the house temperature from a phone app, whereas a smart home has a security system that recognizes your car pulling into the driveway, communicates with the thermostat to change the house temperature, turns on the interior lights, and pre-heats the oven.

For this scenario to be made possible, devices must share a universal/standardized IoT language. IoT standards provide the necessary mechanisms to enable easy implementation, communication, and security. Standardized devices and solutions must undergo robust, real-world testing procedures to ensure they work together and meet predetermined security benchmarks. With certified devices, users will no longer need to worry about different network types, devices, or ecosystems as they have been tested and proven to interoperate before they go to market. With the existence of secure, seamless devices, consumers will no longer need to endure difficult installations or devices that are unable to communicate with one another. As a result, end-users are rewarded with convenience and ease, resulting in customer satisfaction and future product purchases.
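The value of a shared vocabulary can be sketched with a toy event bus. The event names and device responses below are hypothetical, not drawn from any real IoT standard; the point is that devices from different vendors interoperate because they agree on the same event types.

```python
from collections import defaultdict

class EventBus:
    """Minimal publish/subscribe hub standing in for a standardized IoT fabric."""

    def __init__(self):
        self.handlers = defaultdict(list)

    def subscribe(self, event_type, handler):
        self.handlers[event_type].append(handler)

    def publish(self, event_type, **data):
        # Every subscriber understands the shared event type, regardless of vendor.
        return [handler(**data) for handler in self.handlers[event_type]]

bus = EventBus()
# Three devices from (hypothetically) three different vendors all react to
# the same standardized "car_arrived" event, no pairwise integration needed.
bus.subscribe("car_arrived", lambda **d: "thermostat: set 21C")
bus.subscribe("car_arrived", lambda **d: "lights: interior on")
bus.subscribe("car_arrived", lambda **d: "oven: preheat 180C")

print(bus.publish("car_arrived", plate="ABC-123"))
# → ['thermostat: set 21C', 'lights: interior on', 'oven: preheat 180C']
```

Without the shared event type, each vendor pair would need its own integration; with it, adding a fourth device is one `subscribe` call.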

The adoption of standards allows the connected home to evolve into a truly interoperable smart home, providing manufacturers with the building blocks to ensure customers have the device interconnectivity experience they are looking for.

Standardization Benefits for Connected Buildings

IT managers rely on building automation controls (BACs) to manage a company’s network in a commercial building. This can become complicated, however, as new smart devices are added to legacy networks and have difficulty co-existing. Today, building device applications remain in silos, each with its own proprietary solution. Building administrators have limited control over individual devices in each domain, and provisioning is complex. IT managers also face the difficulty of being locked into one ecosystem, which limits product choice, causes vendor lock-in, and leaves a lack of end-to-end encryption options.

Standards provide a set of necessary requirements for devices, including device control and management, for solving the pressing issues IT managers face. By utilizing industry standards, lighting control and building management systems can converge into a seamless, secure configuration. With these standards in place, building administrators gain streamlined control over application domains, with real-time monitoring of the shared common network, simpler provisioning, and the possibility of extending this to multiple buildings through the cloud.

Unlock IoT’s Full Potential

These are a few of the many benefits IoT standards have to offer. Vendors have many IoT connectivity options to choose from and these protocols have the capability to co-exist by providing mechanisms to bridge to other IoT ecosystems. Bridging features help to enable the implementation of multiple technologies for a holistic IoT approach. These protocols are available for adoption today and will unlock the massive potential of IoT – the industry simply has to turn the key.




The Edge vs. The Cloud: A Hybrid Approach for Manufacturing




Illustration: © IoT For All

Edge and cloud computing are often misunderstood to be mutually exclusive but, while they may function in different ways, leveraging one does not preclude the use of the other. In fact, they actually complement one another quite well.

An Introduction to Edge Computing in Manufacturing

The edge computing framework is quickly finding its way into a variety of industries as Internet of Things (IoT) devices become more commonplace. One of the most promising edge computing use cases is in manufacturing, where these new technologies can potentially lead to massive productivity gains. 

While IoT is already proving to be a critical enabler on the factory floor, manufacturers are now looking to enhance the responsiveness of their production systems further. To achieve this, these companies are looking toward smart manufacturing with edge computing as its main enabler.

Smart manufacturing envisions a future where factory equipment can make autonomous decisions based on what’s happening on the factory floor. Businesses can more easily integrate all steps of the manufacturing process including design, manufacturing, supply chain, and operations. This facilitates greater flexibility and reactivity when participating in competitive markets. Enabling this vision requires a combination of related technologies such as IoT, AI/machine learning, and Edge Computing.

The key advantage of gathering analytics at the edge of the network is the ability to analyze and act on real-time data without the bandwidth costs of sending that data offsite (to the cloud or the data center) for analysis. Manufacturing is time-sensitive: delays can mean out-of-spec components, equipment downtime, worker injury, or death. For more complex, longer-term tasks, data can be sent to the cloud and combined with other structured and unstructured forms of data.

As a result, the use of these two separate computing frameworks is not mutually exclusive, but rather a symbiotic relationship that leverages the benefits each provides. 

Why the Edge for Manufacturing?

For manufacturers, the goal of edge computing is to process and analyze data near a machine that needs to act on that data in a time-sensitive manner: it must make a decision right away, with no delay.

In a traditional IoT platform setup, the data produced by a device in the field (for all intents and purposes, let’s call it a machine tool) is collected via an IoT device and relayed back to a central network server (pushed to the cloud, if you will).

In the cloud, all data is gathered and processed in a centralized location, usually in a data center. All devices that need to access this data or use applications associated with it must first connect to the cloud. Since everything is centralized, the cloud is generally quite easy to secure and control while still allowing for reliable remote access to data.

Once that data is processed (“analyzed”) in the cloud, which happens quickly, it can be immediately accessed through an IoT platform (such as MachineMetrics) in a number of ways, whether via real-time visualization, reporting, or diagnostic analytics, to help improve your ability to make decisions based on real data.

The problem: things get more complicated when decisions need to be made extremely quickly.

First, it takes time for data to travel the “distance” from the edge device back to the cloud. This slight delay might only be a matter of milliseconds, but it can be critical for certain decisions such as stopping a machine tool from breaking. 

Second, these machines produce an enormous amount of data (hundreds of data points every millisecond), and all that data traveling back and forth between the edge and the cloud strains the available communication bandwidth.

The solution: rather than constantly delivering every piece of data back to the cloud, edge-enabled devices can gather and process data in real time right there, at the “edge” of the machine, allowing them to respond faster and more effectively.
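That split, act locally when latency matters and batch everything else for the cloud, can be sketched as a simple routing rule. The threshold and field names below are invented for illustration, not taken from any real machine-monitoring product.

```python
SPINDLE_LOAD_LIMIT = 0.9  # hypothetical: stop the tool above this load

def handle_reading(reading, cloud_batch):
    """Return an immediate edge action, or queue the reading for the cloud."""
    if reading["spindle_load"] > SPINDLE_LOAD_LIMIT:
        # In-cycle decision: act now, no round trip to the cloud.
        return "STOP_MACHINE"
    # Non-urgent data is batched; end-of-cycle analytics can wait.
    cloud_batch.append(reading)
    return "OK"

batch = []
print(handle_reading({"spindle_load": 0.42}, batch))  # normal reading
print(handle_reading({"spindle_load": 0.97}, batch))  # overload, edge stops tool
print(len(batch))  # only the normal reading was queued for the cloud
```

The overload check never leaves the device, so its latency is a function call rather than a network round trip, while the routine readings still reach the cloud for longer-term analysis.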

Edge Use Cases in Action

Let’s now discuss practical reasons for the use of edge computing in manufacturing. There are a variety of business benefits to ensuring that all networks are properly connected to the cloud while also being able to deliver powerful computing resources at the edge.

  1. Improved equipment uptime: A failure in a subsystem or component, or the impact of running a component in a degraded state, can be predicted in real time, continually refined as more data is analyzed, and used to enhance operational use and maintenance scheduling.
  2. Reduced maintenance costs: Enhanced analysis of needed maintenance also means that more repairs can be completed on first visits by giving mechanics detailed instructions about the causes of a problem, what action is needed, and what parts are required, reducing repair cost.
  3. Lower spare parts inventory: Edge analytics models can be tailored to the requirements of an individual device or system. This might mean reading sensors directly associated with certain components and/or subsystems. Guided by an organization’s desired business value, the edge model can then define how the device or system should be optimally configured to achieve a business goal, making a spare parts inventory vastly more efficient at minimal cost.
  4. Critical failure prevention: By acquiring, monitoring, and analyzing data on components, edge analytics can identify a cause before its effect materializes, enabling earlier problem detection and prevention.
  5. Condition-based monitoring: With the convergence of IT and OT, manufacturers can access machine data and monitor the condition of their equipment on the shop floor, even on legacy equipment.
  6. New business models: Perhaps most important, edge analytics can help shape new business models to capture new opportunities. For example, it can improve just-in-time parts management using self-monitoring analysis that predicts which components will fail and when, triggering parts replacement notifications throughout the value chain. This enables an “as needed” maintenance schedule, reduces downtime and parts inventory, and results in a more efficient model.

So, when you are dealing with a CNC machine tool, in-cycle stoppages are an edge decision, while end-of-cycle stoppages can be a cloud decision. In-cycle stoppages often require near-zero lag, while end-of-cycle stoppages tolerate more. In the former case, the machine leverages edge analytics in-cycle to adapt and shut itself down automatically, avoiding potentially costly downtime and maintenance.

It’s Not Edge vs. Cloud…Right?

We know the point of Industrial IoT (IIoT) is to apply advanced analytics to vast quantities of machine data, all with the aim of reducing unplanned downtime, reducing the overall cost of machine maintenance, and leveraging machine learning capabilities. The cloud has been instrumental in making this kind of massive data acquisition, transfer, and analysis possible.

When data speed is the order of the day and connectivity needs to be solid, the edge is the solution manufacturers should look to. Applying AI and machine learning algorithms to alert, diagnose, and predict problems in real time is more readily accomplished with proximity, speed, and a solid network, especially when the goal is to let your team take immediate corrective action, or to apply an adaptation automatically, without human intervention, before a costly failure occurs.

To be clear, edge computing will not replace cloud computing; the two approaches complement each other. Cloud computing is a more general-purpose platform for data collection, analytics, and historical reporting. But there are hundreds of use cases where reaction time is the key value of the IoT system, such as certain predictive maintenance events, where the round trip to the cloud prevents the analysis from happening quickly enough.

Manufacturing companies need to be able to make decisions at three different levels: at the machine level, at the factory level, and at the business level. By incorporating edge computing with cloud computing capabilities, companies can maximize the potential of both approaches while minimizing their limitations.

