AI Being Tapped to Understand What Whales Say to Each Other 

AI is being applied to whale research, to understand what whales are trying to communicate with the audible sounds they make in the ocean. (Credit: Getty Images) 

By AI Trends Staff 

AI is being applied to whale research, especially to understand what whales are trying to communicate through the audible sounds they make to each other in the ocean.

For example, marine biologist Shane Gero has worked to match clicks from whales around the Caribbean island nation of Dominica to the behavior he hopes will reveal the meanings of the sounds they make. Gero is a behavioral ecologist affiliated with the Marine Bioacoustics Lab at Aarhus University in Denmark and the Department of Biology of Dalhousie University in Halifax, Nova Scotia.

Shane Gero, founder, Dominica Sperm Whale Project

Gero works with a team from Project CETI, a nonprofit that aims to apply advanced machine learning and state-of-the-art robotics to listen to and translate the communication of whales. Project CETI recently announced a five-year effort to build on Gero’s work with a research project that will try to decipher what sperm whales are saying to each other, according to an account in National Geographic.

The team includes experts in linguistics, robotics, machine learning, and camera engineering. They will lean on advances in AI which can now translate one human language to another, in what is believed to be the largest interspecies communication effort in history.  

The team has been building specialized video and audio recording devices, which aim to capture millions of whale sounds and analyze them. They hope to gain insight into the underlying architecture of whale chatter.  

“The question comes up: What are you going to say to them? That kind of misses the point,” Gero stated. “It assumes they have a language to talk about us and boats or the weather or whatever we might want to ask them about.”  

The scientists wonder whether whales have grammar, syntax or anything analogous to words and sentences. They plan to track how whales behave when making or hearing clicks. Using advances in natural language processing, researchers will try to interpret this information.   

The Project CETI team includes David Gruber, a professor of biology and environmental science at City University of New York. He became interested in sperm whales while a fellow at Harvard University’s Radcliffe Institute. He wondered whether sperm whales could have a communication system worthy of being called language, something linguists have heretofore thought non-human animals lack. After learning of Gero’s work, the two joined forces.

Gruber’s machine learning colleagues applied AI techniques to some of Gero’s audio to identify individual sperm whales from their sounds. The system was right more than 94% of the time. The fact that whales rely almost exclusively on acoustic information narrowed the task.

The CETI researchers have spent a year developing an array of high-resolution underwater sensors that will record sound 24 hours a day across a vast portion of Gero’s whale study area off Dominica. Three of these listening systems, each attached to a buoy at the surface, will drop straight down thousands of feet to the bottom, with hydrophones every few hundred meters. 

“We want to know as much as we can,” Gruber stated to National Geographic. “What’s the weather doing? Who’s talking to who? What’s happening 10 kilometers away? Is the whale hungry, sick, pregnant, mating? But we want to be as invisible as possible as we do it.”

Scientists Studying Sounds of Endangered Beluga Whales in Alaska

Similar whale research is going on in Alaska, where scientists are using a machine learning application to collect information essential to protect and recover the endangered Cook Inlet beluga whale population, according to a post from NOAA Fisheries. (NOAA is the National Oceanic and Atmospheric Administration, an agency within the US Department of Commerce.)  

In 1979, the Cook Inlet beluga population began a rapid decline. Despite being protected as an endangered species since 2008, the population still shows no sign of recovery and continues to decline.  

Beluga whales live in the Arctic or sub-Arctic. They are vulnerable to many threats such as pollution, extreme weather, and interactions with fishing activity. Underwater noise pollution, which interferes with the whales’ ability to communicate, is a special concern. The Cook Inlet, in Alaska’s most densely populated region, supports heavy vessel traffic, oil and gas exploration, construction, and other noisy human activities. 

The scientists working in Cook Inlet are using passive acoustic monitoring to provide information on beluga movement and habitat use. It also helps scientists identify when noise may be affecting beluga behavior, and ultimately, survival. 

Scientists listen for belugas using a network of moored underwater recorders, which collect enormous volumes of audio data including noise from the natural ocean environment, human activities, and other animals, as well as beluga calls. 

To detect potential beluga signals in these sometimes noisy recordings, scientists have traditionally used a series of basic algorithms. However, the algorithms do not work as well in noisy areas. It is hard to distinguish faint beluga calls from signals such as creaking ice, ship propellers, and the calls of other cetaceans like killer and humpback whales. Until now, it took scientists months of labor-intensive analysis to remove the false detections and correctly classify beluga calls.

This year, the NOAA scientists are working with Microsoft AI experts to train AI programs using deep learning techniques. The programs will perform the most tedious, expensive, and time-consuming part of analyzing acoustic data: classifying detections as beluga calls or false signals.   
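
The article does not describe the model NOAA and Microsoft built, but a common pattern for this kind of task is to convert each acoustic detection into a spectrogram and train a classifier to separate calls from false signals. The sketch below is purely illustrative: it substitutes synthetic clips for real hydrophone recordings and assumes the librosa and scikit-learn Python libraries; it is not NOAA's code.

```python
# Illustrative sketch only: label acoustic detections as "call" vs "noise"
# using log-mel spectrogram features and a random-forest classifier.
# Synthetic clips stand in for labeled hydrophone recordings.
import numpy as np
import librosa
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split

SR = 16000  # sample rate, Hz

def mel_features(y, sr=SR, n_mels=32):
    """Turn one fixed-length clip into a flattened log-mel feature vector."""
    mel = librosa.feature.melspectrogram(y=y, sr=sr, n_mels=n_mels)
    return librosa.power_to_db(mel).flatten()

rng = np.random.default_rng(0)

def toy_clip(is_call):
    """Synthetic 2-second clip: background noise, plus a frequency sweep
    standing in for a whale call when is_call is True."""
    t = np.linspace(0.0, 2.0, 2 * SR, endpoint=False)
    clip = rng.normal(scale=0.3, size=t.shape)
    if is_call:
        clip += np.sin(2 * np.pi * (1500.0 + 400.0 * t) * t)
    return clip.astype(np.float32)

labels = rng.integers(0, 2, size=200)
X = np.stack([mel_features(toy_clip(bool(b))) for b in labels])

X_tr, X_te, y_tr, y_te = train_test_split(X, labels, test_size=0.25, random_state=0)
clf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X_tr, y_tr)
print("held-out accuracy:", accuracy_score(y_te, clf.predict(X_te)))
```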

Manuel Castellote, Bioacoustician, NOAA Alaska Fisheries Science Center

“Deep learning is as close as we can get to how the human brain works,” stated Manuel Castellote, NOAA Fisheries affiliate with the University of Washington, Joint Institute for the Study of the Atmosphere and Ocean, who is leading the study. “And so far the results have been beyond expectation. Machine learning is achieving more than 96% accuracy in classifying detections compared to a scientist doing the classification. It is even picking up things human analysts missed. We didn’t expect it to work as well as humans. Instead, it works better.” 

The machine learning model is highly accurate and can process an enormous amount of data very quickly. “A single mooring dataset, with 6-8 months of sound recordings, would take 10-15 days to manually classify all the detections,” Castellote stated. “With machine learning tools, it is done overnight. Unsupervised.”   

A network of 15 moorings in Cook Inlet is deployed and retrieved twice a year. “Remote sensors, like acoustic moorings, have revolutionized our ability to monitor wildlife populations, but have also created a backlog of raw data,” stated Dan Morris, principal scientist on the Microsoft AI for Earth team. AI is used to automate this data analysis, making it more efficient. This way, scientists “can get back to doing science instead of labeling data,” he stated.

Simon Fraser University Studying Killer Whale Calls 

In another effort, researchers with Simon Fraser University, a public research university in British Columbia, Canada, are using AI and machine learning on a project to classify whale calls. The goal is to create a warning system to help protect endangered killer whales from potentially fatal ship strikes. 

The project is supported with $568,179 in funding from Fisheries and Oceans Canada under the Oceans Protection Plan–Whale Detection and Collision Avoidance Initiative. 

Ruth Joy, statistical ecologist and lecturer, Simon Fraser University

“Southern resident killer whales are an endangered species and people are very fond of these animals,” stated Ruth Joy, a statistical ecologist and lecturer in SFU’s School of Environmental Science, in a press release from the university. 

 “They want to see that these marine mammals are protected and that we are doing everything that we can to make sure that the Salish Sea is a good home for them.” 

The team is working with citizen scientists and the Orcasound project to provide several terabytes of whale call datasets, which are collected by Steven Bergner, a computing science research associate at SFU’s Big Data Hub.

The acoustic data will be used to ‘teach’ the computer to recognize which call belongs to each type of cetacean, according to Bergner. The project brings together experts from fields such as biology, statistics and machine learning. “In the end, we are developing a system that will be a collaboration between human experts and algorithms,” Bergner stated. 

Orcas, or killer whales, seen along the West Coast are divided into four distinct populations: the salmon-eating southern and northern residents; the transients, which prey on seals or other whales; and the offshores, which mostly prey on sharks. Each orca population is further categorized into families called pods. Each pod has its own dialect, and each population’s calls differ from those of the other populations.

Read the source articles and information in National Geographic, from NOAA Fisheries and in a press release from Simon Fraser University. 

Source: https://www.aitrends.com/ai-in-science/ai-being-tapped-to-understand-what-whales-say-to-each-other/


Understanding dimensionality reduction in machine learning models




Machine learning algorithms have gained fame for being able to ferret out relevant information from datasets with many features, such as tables with dozens of rows and images with millions of pixels. Thanks to advances in cloud computing, you can often run very large machine learning models without noticing how much computational power works behind the scenes.

But every new feature you add to your problem adds to its complexity, making it harder to solve with machine learning algorithms. Data scientists use dimensionality reduction, a set of techniques that remove excessive and irrelevant features from their machine learning models.

Dimensionality reduction slashes the costs of machine learning and sometimes makes it possible to solve complicated problems with simpler models.

The curse of dimensionality

Machine learning models map features to outcomes. For instance, say you want to create a model that predicts the amount of rainfall in one month. You have a dataset of information collected from different cities across different months. The data points include temperature, humidity, city population, traffic, number of concerts held in the city, wind speed, wind direction, air pressure, number of bus tickets purchased, and the amount of rainfall. Obviously, not all this information is relevant to rainfall prediction.

Some of the features might have nothing to do with the target variable. Evidently, population and number of bus tickets purchased do not affect rainfall. Other features might be correlated to the target variable, but not have a causal relation to it. For instance, the number of outdoor concerts might be correlated to the volume of rainfall, but it is not a good predictor for rain. In other cases, such as carbon emission, there might be a link between the feature and the target variable, but the effect will be negligible.

In this example, it is evident which features are valuable and which are useless. In other problems, the excessive features might not be obvious and may require further data analysis.

But why bother to remove the extra dimensions? When you have too many features, you’ll also need a more complex model. A more complex model means you’ll need a lot more training data and more compute power to train your model to an acceptable level.

And since machine learning has no understanding of causality, models try to map any feature included in their dataset to the target variable, even if there’s no causal relation. This can lead to models that are imprecise and erroneous.

On the other hand, reducing the number of features can make your machine learning model simpler, more efficient, and less data-hungry.

The problems caused by too many features are often referred to as the “curse of dimensionality,” and they’re not limited to tabular data. Consider a machine learning model that classifies images. If your dataset is composed of 100×100-pixel images, then your problem space has 10,000 features, one per pixel. However, even in image classification problems, some of the features are excessive and can be removed.

Dimensionality reduction identifies and removes the features that are hurting the machine learning model’s performance or aren’t contributing to its accuracy. There are several dimensionality reduction techniques, each of which is useful for certain situations.

Feature selection


A basic and very efficient dimensionality reduction method is to identify and select a subset of the features that are most relevant to the target variable. This technique is called “feature selection.” Feature selection is especially effective when you’re dealing with tabular data in which each column represents a specific kind of information.

When doing feature selection, data scientists do two things: keep features that are highly correlated with the target variable, and keep features that contribute the most to the dataset’s variance. Libraries such as Python’s Scikit-learn have plenty of good functions to analyze, visualize, and select the right features for machine learning models.
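
As a minimal sketch of what that looks like in code (with a synthetic regression dataset standing in for real data), Scikit-learn’s SelectKBest keeps the k features with the strongest univariate relationship to the target:

```python
# Keep the 10 features with the strongest univariate relation to the target.
from sklearn.datasets import make_regression
from sklearn.feature_selection import SelectKBest, f_regression

# 100 features, of which only 10 actually drive the target
X, y = make_regression(n_samples=500, n_features=100, n_informative=10, random_state=0)

selector = SelectKBest(score_func=f_regression, k=10)
X_reduced = selector.fit_transform(X, y)

print(X.shape, "->", X_reduced.shape)                 # (500, 100) -> (500, 10)
print("kept columns:", selector.get_support(indices=True))
```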

For instance, a data scientist can use scatter plots and heatmaps to visualize the covariance of different features. If two features are highly correlated to each other, then they will have a similar effect on the target variable, and including both in the machine learning model will be unnecessary. Therefore, you can remove one of them without causing a negative impact on the model’s performance.

Heatmaps illustrate the covariance between different features. They are a good guide to finding and culling excessive features.

The same tools can help visualize the correlations between the features and the target variable. This helps remove variables that do not affect the target. For instance, you might find out that out of 25 features in your dataset, seven of them account for 95 percent of the effect on the target variable. This will enable you to shave off 18 features and make your machine learning model a lot simpler without suffering a significant penalty to your model’s accuracy.
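
Both checks can be sketched in a few lines of pandas and seaborn. The toy table below echoes the rainfall example: “humidity” nearly duplicates “temperature,” and only temperature drives the target; the 0.9 drop threshold is an arbitrary illustrative choice.

```python
# Toy illustration: correlation with the target, a feature-covariance
# heatmap, and dropping one member of each highly correlated pair.
import numpy as np
import pandas as pd
import seaborn as sns
import matplotlib.pyplot as plt

rng = np.random.default_rng(0)
base = rng.normal(size=200)
features = pd.DataFrame({
    "temperature": base,
    "humidity": 0.95 * base + 0.05 * rng.normal(size=200),  # near-duplicate of temperature
    "wind_speed": rng.normal(size=200),
    "bus_tickets": rng.normal(size=200),
})
rainfall = 2.0 * features["temperature"] + rng.normal(size=200)  # toy target

# How strongly does each feature correlate with the target?
print(features.corrwith(rainfall).sort_values(ascending=False))

# Heatmap of feature-to-feature correlations
corr = features.corr().abs()
sns.heatmap(corr, annot=True, cmap="viridis")
plt.show()

# Scan the upper triangle so each feature pair is considered once,
# then drop one member of any pair correlated above 0.9.
upper = corr.where(np.triu(np.ones(corr.shape, dtype=bool), k=1))
to_drop = [col for col in upper.columns if (upper[col] > 0.9).any()]
print("dropping:", to_drop)                            # expected: ['humidity']
features_reduced = features.drop(columns=to_drop)
```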

Projection techniques

Sometimes, you don’t have the option to remove individual features. But this doesn’t mean that you can’t simplify your machine learning model. Projection techniques, also known as “feature extraction,” simplify a model by compressing several features into a lower-dimensional space.

A common example used to illustrate projection techniques is the “Swiss roll” (pictured below), a set of data points that swirl around a focal point in three dimensions. This dataset has three features. The value of each point (the target variable) is measured by how close it is, along the convoluted path, to the center of the Swiss roll. In the picture below, red points are closer to the center and yellow points are farther along the roll.

Swiss roll

In its current state, creating a machine learning model that maps the features of the swiss roll points to their value is a difficult task and would require a complex model with many parameters. But with the help of dimensionality reduction techniques, the points can be projected to a lower-dimension space that can be learned with a simple machine learning model.

There are various projection techniques. In the case of the above example, we used “locally linear embedding” (LLE), an algorithm that reduces the dimension of the problem space while preserving the key elements that separate the values of data points. When our data is processed with LLE, the result looks like the following image, which is like an unrolled version of the Swiss roll. As you can see, points of each color remain together. In fact, this problem can still be simplified into a single feature and modeled with linear regression, the simplest machine learning algorithm.

Swiss roll, projected
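
A rough reconstruction of that experiment with Scikit-learn follows; the neighbor count and other parameters are illustrative guesses, not necessarily those behind the figures above.

```python
# Unroll the Swiss roll: project 3-D points into 2-D with
# locally linear embedding (LLE).
import matplotlib.pyplot as plt
from sklearn.datasets import make_swiss_roll
from sklearn.manifold import LocallyLinearEmbedding

# X: (n, 3) coordinates; t: position along the roll, used as the color/value
X, t = make_swiss_roll(n_samples=1500, noise=0.1, random_state=0)

lle = LocallyLinearEmbedding(n_components=2, n_neighbors=12, random_state=0)
X_unrolled = lle.fit_transform(X)

plt.scatter(X_unrolled[:, 0], X_unrolled[:, 1], c=t, cmap="viridis", s=5)
plt.title("Swiss roll after locally linear embedding")
plt.show()
```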

While this example is hypothetical, you’ll often face problems that can be simplified if you project the features to a lower-dimensional space. For instance, “principal component analysis” (PCA), a popular dimensionality reduction algorithm, has found many useful applications to simplify machine learning problems.

In the excellent book Hands-On Machine Learning with Scikit-Learn, Keras & TensorFlow, data scientist Aurélien Géron shows how you can use PCA to reduce the MNIST dataset from 784 features (28×28 pixels) to 150 features while preserving 95 percent of the variance. This level of dimensionality reduction has a huge impact on the costs of training and running artificial neural networks.
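
Scikit-learn’s PCA accepts a target variance fraction directly, so the reduction Géron describes can be sketched as follows (the exact number of components it settles on may differ slightly from 150):

```python
# Reduce MNIST from 784 pixel features to the smallest number of
# principal components that preserves 95% of the variance (~150).
from sklearn.datasets import fetch_openml
from sklearn.decomposition import PCA

X, y = fetch_openml("mnist_784", version=1, return_X_y=True, as_frame=False)

pca = PCA(n_components=0.95)  # a float in (0, 1) means "variance to keep"
X_reduced = pca.fit_transform(X)

print(X.shape, "->", X_reduced.shape)
print("variance preserved:", pca.explained_variance_ratio_.sum())
```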


There are a few caveats to consider about projection techniques. Once you develop a projection, you must transform new data points into the lower-dimensional space before running them through your machine learning model; however, the cost of this preprocessing step is small compared to the gains of having a lighter model. A second consideration is that transformed data points are not directly representative of their original features, and transforming them back to the original space can be tricky and in some cases impossible. This can make it difficult to interpret the inferences made by your model.
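
One way to keep that preprocessing step from being forgotten at inference time is to bundle the projection and the model into a single Scikit-learn Pipeline, as in this sketch on the small digits dataset:

```python
# Bundle PCA and a classifier so new data points are projected
# into the lower-dimensional space automatically at predict time.
from sklearn.datasets import load_digits
from sklearn.decomposition import PCA
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.pipeline import Pipeline

X, y = load_digits(return_X_y=True)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

model = Pipeline([
    ("pca", PCA(n_components=0.95)),          # fit on training data only
    ("clf", LogisticRegression(max_iter=1000)),
])
model.fit(X_tr, y_tr)

# score() runs the same PCA transform on the held-out points first
print("test accuracy:", model.score(X_te, y_te))
```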

Dimensionality reduction in the machine learning toolbox

Having too many features will make your model inefficient, but cutting too many features will not help either. Dimensionality reduction is one among many tools data scientists can use to make better machine learning models, and as with every tool, it must be used with caution and care.

Ben Dickson is a software engineer and the founder of TechTalks, a blog that explores the ways technology is solving and creating problems.

This story originally appeared on Bdtechtalks.com. Copyright 2021

Source: https://venturebeat.com/2021/05/16/understanding-dimensionality-reduction-in-machine-learning-models/


Bitcoin Mining Company Vows to be Carbon Neutral Following Tesla’s Recent Statement


Last week, Elon Musk and Tesla shocked the entire crypto industry following an announcement that the electric car company will no longer accept bitcoin payments for “environmental reasons.”

A Hard Pill For Bitcoin Maximalists

Giving its reasons, Tesla argued that Bitcoin mining requires massive energy consumption, much of it generated from fossil fuels, especially coal, and as such causes environmental pollution.

The announcement caused a market dip that saw over $4 billion of both short and long positions liquidated, as the entire market capitalization lost almost $400 billion in a day.

For Bitcoin maximalists and proponents, Tesla’s decision was a hard pill to swallow, and that was evident in their responses to the electric car company and its CEO.

While the likes of Max Keiser lambasted Musk for his company’s move, noting that it was due to political pressure, others like popular YouTuber Chris Dunn were seen canceling their Tesla Cybertruck orders.



Adding more fuel to the fire, Musk also responded to a long Twitter thread by Peter McCormack, implying that Bitcoin is not actually decentralized.

Musk Working With Dogecoin Devs

Elon Musk, who named himself the “Dogefather” on SNL, created a Twitter poll, asking his nearly 55 million followers if they want Tesla to integrate DOGE as a payment option.

The poll, which had almost 4 million votes, was favorable for Dogecoin, as more than 75% of the community voted “Yes.”

Following Tesla’s announcement, the billionaire tweeted that he is working closely with Dogecoin developers to improve transaction efficiency, saying that it is “potentially promising.”

Tesla dropping bitcoin as a payment instrument over energy concerns, with the possibility of integrating dogecoin payments, comes as a surprise to bitcoiners since the two cryptocurrencies use a Proof-of-Work (PoW) consensus algorithm and, as such, face the same underlying energy problem.

Elon Musk: Dogecoin Wins Bitcoin

Despite using a PoW algorithm, Elon Musk continues to favor Dogecoin over Bitcoin. Responding to a tweet that covered some of the reasons why Musk easily chose DOGE over BTC, the billionaire CEO agreed that Dogecoin wins Bitcoin in many ways.

Comparing DOGE to BTC, Musk noted that “DOGE speeds up block time 10X, increases block size 10X & drops fee 100X. Then it wins hands down.”

Max Keiser: Who’s The Bigger Idiot?

As Elon Musk continues his lovey-dovey affair with Dogecoin, Bitcoin proponents continue to criticize the Dogefather.

Following Musk’s comments on Dogecoin today, popular Bitcoin advocate Max Keiser took to his Twitter page to ridicule the Tesla boss while recalling when gold bug Peter Schiff described Bitcoin as “intrinsically worthless” after he lost access to his BTC wallet.

“Who’s the bigger idiot?” Keiser asked.

Aside from Keiser, other Bitcoin proponents such as Michael Saylor also replied to Tesla’s CEO.

Source: https://coingenius.news/bitcoin-mining-company-vows-to-be-carbon-neutral-following-teslas-recent-statement-6/?utm_source=rss&utm_medium=rss&utm_campaign=bitcoin-mining-company-vows-to-be-carbon-neutral-following-teslas-recent-statement-6


PlotX v2 Mainnet Launch: DeFi Prediction Markets


In early Sunday trading, BTC prices had fallen to their lowest levels for over 11 weeks, hitting $46,700 before a minor recovery.

The last time Bitcoin dropped to these levels was at the end of February during the second major correction of this ongoing rally. A rebound off that bottom sent prices above $60K for the first time in the two weeks that followed.

Later today, Bitcoin will close another weekly candle. If the candle closes at these levels, it will be the worst weekly close since February 22nd, when BTC ended the week at $45,240, according to Bitstamp. Two weeks ago the weekly candle closed at $49,200, currently the lowest weekly close since February.

Second ‘Lower Low’ For Bitcoin

This time around, things feel slightly different and the bearish sentiment is returning to crypto-asset markets. Since its all-time high of $65K on April 14, Bitcoin has made a lower high and has now formed a second lower low on the daily chart, which is indicative of a larger downtrend developing.

Analyst ‘CryptoFibonacci’ has been eyeing the weekly chart which also suggests the bulls could be running out of steam.



The move appears to have been driven by Elon Musk again with a tweet about Bitcoin’s energy consumption on May 13. Bitcoin’s fear and greed index has dropped to 20 – ‘extreme fear’ – its lowest level since the March 2020 market crash. At the time of press, BTC was trading at just under $48,000, down 4% over the past 24 hours.

Market Cap Shrinks by $150B

As usual, the move has initiated a selloff for the majority of other cryptocurrencies resulting in around $150 billion exiting the markets over the past day or so.

The total market cap has declined to $2.3 trillion after an all-time high of $2.5 trillion on May 12. Valuations are still high on the long-term view, but losses could accelerate rapidly if the bearish sentiment increases.

Not all crypto assets are correcting this weekend, and some have been building on recent gains to push even higher – although they are few in number.

Those weekend warriors include Cardano, which has added 4.8% on the day to trade at $2.27, according to Coingecko. ADA hit an all-time high on Saturday, May 15, reaching $2.36, a gain of 54% over the past 30 days.

Ripple’s XRP is also seeing a resurgence with a 13% pump on the day to flip Cardano for the fourth spot. XRP is currently trading at $1.58 with a market cap of $73 billion. The only other two cryptocurrencies in the green at the time of writing are Stellar and Solana, gaining 3.7% and 12% respectively.

Source: https://coingenius.news/plotx-v2-mainnet-launch-defi-prediction-markets-58/?utm_source=rss&utm_medium=rss&utm_campaign=plotx-v2-mainnet-launch-defi-prediction-markets-58
