
Big Data

Chinese content platforms pledge self-discipline – industry group

SHANGHAI (Reuters) - Chinese content platforms including Weibo and Tencent Video have agreed to enforce more self-discipline to help maintain a “clear” cyberspace environment, a government-affiliated industry association said on Saturday.

Chinese regulators last month cracked down (https://www.reuters.com/world/china/china-crack-down-chaotic-online-fan-culture-2021-08-27) on what they called a “chaotic” celebrity fan culture after a series of scandals involving artists. The authorities barred platforms from publishing lists of popular celebrities and ordered fan groups to be regulated.

The China Association of Performing Arts (CAPA) said it met on Friday with platform representatives, who pledged to promote only “healthy” content with positive values, to refrain from using data and traffic as their main guide and to stop encouraging “false hype”.

Fourteen platforms signed the pledge, CAPA said in a WeChat statement, including short video platform Douyin, the Chinese version of TikTok, and news aggregator Jinri Toutiao, both owned by ByteDance.

Tencent said on its official Weibo account that it would work to create a clean and upright online culture. Weibo and ByteDance did not immediately respond to requests for comment.

The platforms would strengthen their management of accounts and restrict those that spread baseless star gossip or stir up conflicts between fan groups, CAPA said. They would also encourage users to actively report illegal content.

“The participating platforms reached a consensus that in order to maintain a clean cyberspace environment and strengthen the construction of online cultural content, companies should carry out more proactive self-discipline,” it said.

(Reporting by Brenda Goh; Editing by William Mallard and Alex Richardson)

Image Credit: Reuters


Source: https://datafloq.com/read/chinese-content-platforms-pledge-self-discipline-industry-group/17770

Artificial Intelligence

AIoT: the Perfect Union Between the Internet of Things and Artificial Intelligence

AI meets IoT (Illustration: © IoT For All)

IoT Without Big Data is Nothing

Imagine Industrial IoT as the nervous system of a company: a network of sensors that collects valuable information from all corners of a production plant and stores it in a repository for analysis and exploitation. This network is necessary to measure and obtain data in order to make informed decisions. But what happens next? What should we do with all that data? We always talk about making good decisions based on reliable information, but although it may sound obvious, that goal is not always easy to achieve. In this article, we will go a bit beyond IoT and focus on the data itself and how to leverage it with AIoT and data analytics.

We will focus specifically on the analysis phase: the process that turns data first into information and then into knowledge (sometimes also referred to as business logic). In the end, however, we won’t stray far from the core subject of IoT, because for us IoT without Big Data is meaningless.

Big Data and Data Analytics

In recent decades, and especially in the 2010s, we have witnessed an incredible flood of data (both structured and unstructured), mass-produced by the ubiquity of digital technologies. In the particular case of the industrial world, taking full advantage of this huge amount of information is paramount to success.

This need to process business data has given rise to the closely related (and often conflated) terms “Big Data,” “Data Science,” and “Data Analytics,” which we could collectively define as the processes we follow to examine the data captured by our network of devices, with the goal of revealing hidden trends, patterns, or correlations that can improve the business.
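As a minimal sketch of that examination step (the CSV file and its column names are hypothetical, invented for this illustration), the snippet below loads plant sensor readings with pandas and surfaces simple correlations and trends:

```python
# Minimal sketch: turn raw IoT telemetry into simple information.
# "plant_telemetry.csv" and its columns are hypothetical examples.
import pandas as pd

# Load raw telemetry captured by the IoT layer.
readings = pd.read_csv("plant_telemetry.csv", parse_dates=["timestamp"])

# Resample to hourly means to smooth out sensor noise.
hourly = readings.set_index("timestamp").resample("1H").mean()

# Correlation matrix: which measurements move together?
print(hourly.corr())

# Simple trend: rolling 24-hour average of one (hypothetical) signal.
print(hourly["vibration_mm_s"].rolling(24).mean().tail())
```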

Because the term is relatively new, there are different definitions of Big Data. One, provided by Gartner, outlines three key aspects: the volume of data, its variety, and the velocity with which it is captured. These are commonly referred to as the 3 V’s, although other definitions expand this to 5 V’s, adding the veracity of the data and the value it brings to the business.

We believe, though, that it does not make much sense to go into theoretical disquisitions on what does and does not qualify as Big Data, because thanks to the ubiquity of data collection devices, Big Data analysis and processing is already applicable to large swaths of the industrial world.

IoT and Big Data

How do IoT and Big Data relate to each other? The main point of connection is usually a database. In general terms, we could say that the work of IoT ends at that database; put another way, the goal of IoT is to dump all the acquired data, in a more or less orderly manner, into a common repository. The domain of Big Data starts by accessing that repository to manipulate the acquired data and extract the information needed.
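To make that hand-off concrete, here is a toy sketch of the shared repository (SQLite is used purely for illustration; the table and column names are invented): the IoT side dumps readings in, and the analytics side reads them back out.

```python
# Toy illustration of the IoT / Big Data hand-off through a shared repository.
# The "telemetry" table and its columns are invented for this example.
import sqlite3

conn = sqlite3.connect("iot_repository.db")
conn.execute(
    "CREATE TABLE IF NOT EXISTS telemetry (sensor_id TEXT, ts TEXT, value REAL)"
)

# --- IoT side: dump acquired readings into the common repository ---
conn.executemany(
    "INSERT INTO telemetry VALUES (?, ?, ?)",
    [("temp-01", "2021-10-28T10:00", 21.4),
     ("temp-01", "2021-10-28T11:00", 22.9)],
)
conn.commit()

# --- Big Data side: access the repository and extract information ---
for sensor_id, avg in conn.execute(
    "SELECT sensor_id, AVG(value) FROM telemetry GROUP BY sensor_id"
):
    print(sensor_id, round(avg, 2))
conn.close()
```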

In any case, it is useful to visualize IoT Big Data Analytics as a toolbox. Depending on the type of information and knowledge we want to extract from the data, we will draw one tool or another from it. Many of these tools are traditional algorithms, or improvements and adaptations of them built on very similar statistical and algebraic principles. These algorithms were not invented in this century, which surprises many who wonder why they are only now gaining relevance.

The quick answer is that the volume of data available is now much greater than when said algorithms were first conceived, but more importantly, the computing power of today’s machines allows the use of these techniques on a larger scale, giving new uses to old methodologies.

But we don’t want to give the impression that everything has already been invented and that the current trend in data analysis has brought nothing new to the table; quite the opposite in fact. The data ecosystem is very broad and has witnessed significant innovation in recent years.

One of the fastest-growing areas is Artificial Intelligence. It could be argued that this is not a recent invention, since the idea was discussed as early as 1956. However, Artificial Intelligence is so broad a concept and its impact so widespread that it is often considered a self-contained discipline. The reality, however, is that in some ways it plays an integral part in Big Data and Data Analytics. It is another tool in our metaphorical toolbox, and one that has found a natural evolution in AIoT.

AIoT: the Artificial Intelligence of Things

The exponential growth in the volume of data requires novel ways of analyzing it. In this context, Artificial Intelligence becomes particularly relevant. According to Forbes, the two main trends that are dominating the technology industry are the Internet of Things (IoT) and Artificial Intelligence.

IoT and AI are two independent technologies that have a significant impact on each other. If IoT is the digital nervous system, then AI is the advanced brain that makes the decisions controlling the overall system. According to IBM, the true potential of IoT will only be achieved through the introduction of AIoT.

But what is Artificial Intelligence, and how is it different from conventional algorithms?

We usually speak of Artificial Intelligence when a machine mimics the cognitive functions of humans: it solves problems in the same way a human would, or finds entirely new ways of understanding data. AI’s strength is its ability to generate new algorithms to solve complex problems - and this is the key - independently of a programmer’s input. Thus we could think of Artificial Intelligence in general, and Machine Learning in particular (the segment within AI with the greatest projected potential for growth), as algorithms that invent algorithms.
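A minimal sketch of that idea, using scikit-learn on a synthetic dataset (the features, values, and failure labels are all invented for illustration): instead of hand-coding a failure rule, we let a decision tree derive one from examples.

```python
# "Algorithms that invent algorithms": the decision rule below is learned
# from (synthetic) data rather than written by a programmer.
from sklearn.tree import DecisionTreeClassifier, export_text

# Hypothetical training data: [temperature, vibration] -> failure (1) or not (0).
X = [[60, 0.2], [65, 0.3], [70, 0.4], [85, 0.9], [90, 1.1], [95, 1.3]]
y = [0, 0, 0, 1, 1, 1]

model = DecisionTreeClassifier(max_depth=2).fit(X, y)

# Print the learned rule: a threshold no one programmed explicitly.
print(export_text(model, feature_names=["temperature", "vibration"]))
print(model.predict([[88, 1.0]]))  # -> [1], predicted failure
```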

Edge AI and Cloud AI

The combination of IoT and AI brings us the concept of AIoT (Artificial Intelligence of Things), intelligent and connected systems that are able to make decisions on their own, evaluate the results of these decisions, and improve over time.

This combination can be done in several ways, of which we would like to highlight two:

  1. On the one hand we could continue to conceptualize AI as a centralized system that processes all impulses and makes decisions. In this case we would be referring to a system in the cloud that centrally receives all telemetry and acts accordingly. This would be referred to as Cloud AI (Artificial Intelligence in the Cloud).
  2. On the other hand, we must also talk about a very important part of our metaphorical nervous system: reflexes. Reflexes are autonomous decisions that the nervous system makes without needing to send all the information to the central processor (the brain). These decisions are made in the periphery, close to the source where the data originates. This is called Edge AI (Artificial Intelligence at the Edge).

Use Cases for Edge AI and Cloud AI

Cloud AI provides a thorough analysis process that takes into account the entire system, whereas Edge AI gives us rapidity of response and autonomy. But as with the human body, these two ways of reacting are not mutually exclusive, and can in fact be complementary.

As an example, a water control system can block a valve in the field the moment it detects a leak to prevent major water losses and, in parallel, send a notification to the central system where higher-level decisions can be made, such as opening alternative valves to channel water through another circuit.
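A sketch of that edge-reflex pattern under stated assumptions: close_valve and notify_cloud are hypothetical stand-ins for a real actuator driver and a real telemetry API, and the leak threshold is invented.

```python
# Edge AI reflex + Cloud AI notification, as in the water-valve example.
# close_valve/notify_cloud are hypothetical stand-ins for real APIs.
FLOW_LEAK_THRESHOLD = 5.0  # litres/min lost between inlet and outlet (assumed)

def close_valve(valve_id: str) -> None:
    print(f"[edge] closing valve {valve_id}")      # would drive an actuator

def notify_cloud(event: dict) -> None:
    print(f"[edge -> cloud] {event}")              # would POST to the cloud

def on_flow_reading(valve_id: str, inlet: float, outlet: float) -> None:
    loss = inlet - outlet
    if loss > FLOW_LEAK_THRESHOLD:
        close_valve(valve_id)          # reflex: immediate local decision
        notify_cloud({                 # context for higher-level decisions
            "event": "leak_suspected",
            "valve": valve_id,
            "loss_l_min": loss,
        })

on_flow_reading("V-17", inlet=42.0, outlet=31.5)   # triggers the reflex
```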

The possibilities are endless and go well beyond this simplified example of reactive maintenance: a sufficiently sophisticated system can predict likely events, thus enabling predictive maintenance.

Another example of AIoT data analytics can be found in Smart Grids, where smart devices at the edge analyze the electricity flows at each node and make load-balancing decisions locally, while in parallel sending all this data to the cloud, where it is analyzed to generate a more comprehensive, nationwide energy strategy. Macroscopic analysis would allow load-balancing decisions to be made at a regional level, or even electricity production to be decreased or increased by shutting down hydroelectric plants or launching a power purchase process from a neighbouring country.


Source: https://www.iotforall.com/aiot-the-perfect-union-between-the-internet-of-things-and-artificial-intelligence


Big Data

TOMORROW at Virtual SciDataCon 2021: Closing Plenary Session ‘Organisational Interoperability is Critical for Data Interoperability’ – 28 October 2021

Tomorrow is the last day of Virtual SciDataCon 2021. We close the conference with a plenary session.

Organisational Interoperability is Critical for Data Interoperability and Virtual SciDataCon Closing Session, Thursday 28 October, 11:00-13:00 UTC: REGISTER FOR THIS SESSION

The CODATA Decadal Programme ‘Data for the Planet: making data work for cross-domain grand challenges’ addresses part of the ISC Strategy and Action Plan. As part of the Decadal Programme, CODATA has committed to: “… work with other international data organisations and their groups (WDS, RDA Working Groups and GO FAIR Implementation Networks), with cross-domain programmes and research initiatives, with the organisations and communities that create metadata specifications and terminologies, with International Scientific Unions and other stakeholders to enable an ecosystem for FAIR data for cross domain research to be developed and implemented.” In this session, we seek to identify, discuss, and develop approaches and incentives to address known or perceived barriers to global participation, with a particular emphasis on initiatives at the organisational level. The intended outcome of the session is a set of agreed areas of action to facilitate engagement with the Decadal Programme.

The session will be followed by short closing reflections and discussions on the Virtual SciDataCon 2021 Conference.

Virtual SciDataCon 2021 is organised by CODATA and the World Data System, the two data organisations of the International Science Council – PROGRAMME AT A GLANCE – FULL PROGRAMME – please note that registration is free, but participants must register for each session they wish to attend.

CODATA and WDS are very grateful to Springer Nature for sponsoring Virtual SciDataCon 2021. This sponsorship has allowed us to run the conference without a cost-recovery access charge.


Source: https://codata.org/tomorrow-at-virtual-scidatcon-2021-closing-plenary-session-organisational-interoperability-is-critical-for-data-interoperability/


Big Data

If you did not already know

DataOps
DataOps is an automated, process-oriented methodology, used by analytic and data teams, to improve the quality and reduce the cycle time of data analytics. While DataOps began as a set of best practices, it has now matured to become a new and independent approach to data analytics. DataOps applies to the entire data lifecycle from data preparation to reporting, and recognizes the interconnected nature of the data analytics team and information technology operations. From a process and methodology perspective, DataOps applies Agile software development, DevOps software development practices and the statistical process control used in lean manufacturing, to data analytics. In DataOps, development of new analytics is streamlined using Agile software development, an iterative project management methodology that replaces the traditional Waterfall sequential methodology. Studies show that software development projects complete significantly faster and with far fewer defects when Agile Development is used. The Agile methodology is particularly effective in environments where requirements are quickly evolving – a situation well known to data analytics professionals. DevOps focuses on continuous delivery by leveraging on-demand IT resources and by automating test and deployment of analytics. This merging of software development and IT operations has improved velocity, quality, predictability and scale of software engineering and deployment. Borrowing methods from DevOps, DataOps seeks to bring these same improvements to data analytics. Like lean manufacturing, DataOps utilizes statistical process control (SPC) to monitor and control the data analytics pipeline. With SPC in place, the data flowing through an operational system is constantly monitored and verified to be working. If an anomaly occurs, the data analytics team can be notified through an automated alert. DataOps is not tied to a particular technology, architecture, tool, language or framework. Tools that support DataOps promote collaboration, orchestration, agility, quality, security, access and ease of use. …
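As a minimal sketch of the statistical process control idea described above (the metric, history, and limits are invented for illustration), the snippet below checks a pipeline’s daily row count against 3-sigma control limits and raises an alert on anomalies:

```python
# SPC-style monitoring of one data-pipeline metric (illustrative values).
import statistics

history = [10_120, 10_340, 9_980, 10_210, 10_055, 10_400, 10_150]
mean = statistics.mean(history)
sigma = statistics.stdev(history)
lower, upper = mean - 3 * sigma, mean + 3 * sigma

def check_batch(row_count: int) -> None:
    """Verify today's batch is within control limits; alert otherwise."""
    if lower <= row_count <= upper:
        print(f"OK: {row_count} rows within [{lower:.0f}, {upper:.0f}]")
    else:
        # In a real DataOps setup this would page the analytics team.
        print(f"ALERT: {row_count} rows outside [{lower:.0f}, {upper:.0f}]")

check_batch(10_230)  # normal
check_batch(4_100)   # anomaly -> automated alert
```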

CoSegNet
We introduce CoSegNet, a deep neural network architecture for co-segmentation of a set of 3D shapes represented as point clouds. CoSegNet takes as input a set of unsegmented shapes, proposes per-shape parts, and then jointly optimizes the part labelings across the set subject to a novel group consistency loss expressed via matrix rank estimates. The proposals are refined in each iteration by an auxiliary network that acts as a weak regularizing prior, pre-trained to denoise noisy, unlabeled parts from a large collection of segmented 3D shapes, where the part compositions within the same object category can be highly inconsistent. The output is a consistent part labeling for the input set, with each shape segmented into up to K (a user-specified hyperparameter) parts. The overall pipeline is thus weakly supervised, producing consistent segmentations tailored to the test set, without consistent ground-truth segmentations. We show qualitative and quantitative results from CoSegNet and evaluate it via ablation studies and comparisons to state-of-the-art co-segmentation methods. …

Stochastic Computation Graph (SCG)
Stochastic computation graphs are directed acyclic graphs that encode the dependency structure of computation to be performed. The graphical notation generalizes directed graphical models. …

Smooth Density Spatial Quantile Regression
We derive the properties and demonstrate the desirability of a model-based method for estimating the spatially-varying effects of covariates on the quantile function. By modeling the quantile function as a combination of I-spline basis functions and Pareto tail distributions, we allow for flexible parametric modeling of the extremes while preserving non-parametric flexibility in the center of the distribution. We further establish that the model guarantees the desired degree of differentiability in the density function and enables the estimation of non-stationary covariance functions dependent on the predictors. We demonstrate through a simulation study that the proposed method produces more efficient estimates of the effects of predictors than other methods, particularly in distributions with heavy tails. To illustrate the utility of the model we apply it to measurements of benzene collected around an oil refinery to determine the effect of an emission source within the refinery on the distribution of the fence line measurements. …


Source: https://analytixon.com/2021/10/24/if-you-did-not-already-know-1540/
