Measuring the Social Media Popularity of Pages with DEA in Java


In the previous article we discussed the Data Envelopment Analysis technique and saw how it can be used as an effective non-parametric ranking algorithm. In this blog post we will develop an implementation of Data Envelopment Analysis in Java and use it to evaluate the social media popularity of webpages and articles on the web. The code is open-sourced (under the GPL v3 license) and you can download it freely from Github.

Update: The Datumbox Machine Learning Framework is now open-source and free to download. Check out the package com.datumbox.framework.algorithms.dea to see the implementation of Data Envelopment Analysis in Java.

Data Envelopment Analysis implementation in Java

The code is written in Java and can be downloaded directly from Github. It is licensed under GPLv3, so feel free to use it, modify it, and redistribute it.

The code implements the Data Envelopment Analysis algorithm, uses the lp_solve library to solve the linear programming problems, and uses data extracted from the Web SEO Analytics index to construct a composite social media popularity metric for webpages based on their shares on Facebook, Google Plus and Twitter. All the theoretical parts of the algorithm are covered in the previous article, and in the source code you can find detailed javadoc comments concerning the implementation.

Below we provide a high-level description of the architecture of the implementation:

1. lp_solve 5.5 library

To solve the various linear programming problems, we use an open-source library called lp_solve. The library itself is written in ANSI C, and a Java wrapper is used to invoke its methods. Thus, before running the code you must install lp_solve on your system. Binaries of the library are available for both Linux and Windows, and you can read more about installation in the lp_solve documentation.

Please make sure the library is installed on your system before trying to run the Java code. For any problems installing or configuring it, please refer to the lp_solve documentation.

2. DataEnvelopmentAnalysis Class

This is the main class of the DEA implementation. It exposes a public method called estimateEfficiency(), which takes a Map of records and returns their DEA scores.

3. DeaRecord Object

The DeaRecord is a special object that stores the data of a record. Since DEA requires separating inputs from outputs, the DeaRecord object stores them separately, in a form the DEA implementation can handle.
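As a rough illustration, a record type that keeps outputs and inputs apart might look like the sketch below. This is based only on the description above and the constructor usage shown later in this article (outputs first, then inputs); the actual DeaRecord class in the framework may differ.

```java
import java.util.Arrays;

// Minimal sketch of a DEA record that stores outputs and inputs separately.
// Field and accessor names are assumptions, not the framework's actual code.
public final class DeaRecordSketch {
    private final double[] output;
    private final double[] input;

    public DeaRecordSketch(double[] output, double[] input) {
        // Defensive copies so the record is immutable once constructed.
        this.output = Arrays.copyOf(output, output.length);
        this.input = Arrays.copyOf(input, input.length);
    }

    public double[] getOutput() { return Arrays.copyOf(output, output.length); }
    public double[] getInput()  { return Arrays.copyOf(input, input.length); }
}
```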

4. SocialMediaPopularity Class

The SocialMediaPopularity class is an application that uses DEA to evaluate the popularity of a page on social media networks based on its Facebook likes, Google +1s, and tweets. It implements two protected methods, calculatePopularity() and estimatePercentiles(), along with two public methods, loadFile() and getPopularity().

The calculatePopularity() method uses the DEA implementation to estimate the scores of the pages based on their social media counts. The estimatePercentiles() method takes the DEA scores and converts them into percentiles. In general, percentiles are easier to explain than DEA scores; when we say that the popularity score of a page is 70%, it means that the page is more popular than 70% of the pages.
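The percentile conversion can be sketched as follows. This is a standalone illustration of the idea, not the framework's actual estimatePercentiles() code; the class, method, and variable names here are assumptions.

```java
import java.util.*;

public class PercentileExample {
    // Converts a map of DEA scores into percentile ranks: a page's
    // percentile is the percentage of pages with a strictly lower score.
    static Map<String, Double> toPercentiles(Map<String, Double> scores) {
        List<Double> sorted = new ArrayList<>(scores.values());
        Collections.sort(sorted);
        Map<String, Double> percentiles = new LinkedHashMap<>();
        for (Map.Entry<String, Double> e : scores.entrySet()) {
            int below = 0;
            for (double s : sorted) {
                if (s < e.getValue()) below++;
            }
            percentiles.put(e.getKey(), 100.0 * below / sorted.size());
        }
        return percentiles;
    }

    public static void main(String[] args) {
        Map<String, Double> scores = new LinkedHashMap<>();
        scores.put("pageA", 0.42);
        scores.put("pageB", 0.91);
        scores.put("pageC", 0.65);
        System.out.println(toPercentiles(scores));
    }
}
```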

In order to estimate the popularity of a particular page, we must have a dataset with the social media counts of other pages. This makes sense: to judge whether a page is popular or not, you must be able to compare it with other pages on the web. To do so, we use a small anonymized sample from the Web SEO Analytics index, provided in txt format. You can build your own database by extracting the social media counts of more pages on the web.

The loadFile() method is used to load the aforementioned statistics into DEA, and the getPopularity() method is an easy-to-use method that takes the Facebook likes, Google +1s, and number of tweets of a page and evaluates its social media popularity.

Using the Data Envelopment Analysis Java implementation

In the DataEnvelopmentAnalysisExample class I provide two examples of how to use the code.

The first example uses the DEA method directly to evaluate the efficiency of organizational units based on their outputs (ISSUES, RECEIPTS, REQS) and inputs (STOCK, WAGES). This example was taken from an article on DEAzone.com.

    Map<String, DeaRecord> records = new LinkedHashMap<>();
    records.put("Depot1", new DeaRecord(new double[]{40.0, 55.0, 30.0}, new double[]{3.0, 5.0}));
    //...adding more records here...

    DataEnvelopmentAnalysis dea = new DataEnvelopmentAnalysis();
    Map<String, Double> results = dea.estimateEfficiency(records);
    System.out.println((new TreeMap<>(results)).toString());

The second example uses our Social Media Popularity application to evaluate the popularity of a page by using data from social media such as Facebook likes, Google +1s, and tweets. All social media counts are marked as outputs, and we pass an empty input vector to DEA.

    SocialMediaPopularity rank = new SocialMediaPopularity();
    rank.loadFile(DataEnvelopmentAnalysisExample.class.getResource("/datasets/socialcounts.txt"));
    Double popularity = rank.getPopularity(135, 337, 9079); //Facebook likes, Google +1s, Tweets
    System.out.println("Page Social Media Popularity: " + popularity.toString());

Necessary Expansions

The provided code is just an example of how DEA can be used as a ranking algorithm. Here are a few expansions that would improve the implementation:

1. Speeding up the implementation

The current DEA implementation evaluates the DEA scores of all the records in the database. This makes it slow, since we must solve as many linear programming problems as there are records in the database. If we don't need the score of every record, we can speed up the execution significantly. A small expansion of the algorithm could therefore give us better control over which records should be solved and which should be used only as constraints.
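To illustrate the idea, here is a minimal sketch of selective evaluation. Note that the linear programming solve is replaced with a toy single-input/single-output ratio measure purely for illustration; the real implementation solves one LP problem per evaluated record via lp_solve. The method signature and names are assumptions, not part of the published API.

```java
import java.util.*;

public class SelectiveDeaSketch {
    // Simplified stand-in for the LP-based DEA solve: a record's efficiency
    // is its output/input ratio divided by the best ratio in the database.
    // The point is that the frontier (the constraints) is built from ALL
    // records, while scores are computed only for the requested targets.
    static Map<String, Double> estimateEfficiency(
            Map<String, double[]> db, Set<String> targets) {
        double best = 0.0;
        for (double[] r : db.values()) {
            best = Math.max(best, r[0] / r[1]); // r[0] = output, r[1] = input
        }
        Map<String, Double> results = new LinkedHashMap<>();
        for (String key : targets) {
            double[] r = db.get(key);
            results.put(key, (r[0] / r[1]) / best);
        }
        return results;
    }

    public static void main(String[] args) {
        Map<String, double[]> db = new LinkedHashMap<>();
        db.put("Depot1", new double[]{40.0, 3.0});
        db.put("Depot2", new double[]{60.0, 3.0});
        db.put("Depot3", new double[]{30.0, 5.0});
        // Only Depot1's score is computed; Depot2 and Depot3 act as constraints.
        System.out.println(estimateEfficiency(db, Set.of("Depot1")));
    }
}
```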

2. Expanding the Social Media Counts Database

The provided social media counts database consists of 1111 samples from the Web SEO Analytics index. To estimate a more accurate popularity score, a larger sample is necessary. You can create your own database by extracting the social media counts of more pages on the web.

3. Adding more Social Media Networks

The implementation uses the Facebook likes, Google +1s, and number of tweets to evaluate the popularity of an article. Nevertheless, metrics from other social media networks can easily be taken into account. All you need to do is build a database with the social media counts from the networks you are interested in and expand the SocialMediaPopularity class to handle them accordingly.

Final comments on the implementation

To be able to expand the implementation you must have a good understanding of how Data Envelopment Analysis works. This is covered in the previous article, so please make sure you read the tutorial before making any changes. Moreover, to use the Java code you must have the lp_solve library installed on your system (see above).

If you use the implementation in an interesting project drop us a line and we will feature your project on our blog. Also if you like the article, please take a moment and share it on Twitter or Facebook.

About Vasilis Vryniotis

My name is Vasilis Vryniotis. I’m a Data Scientist, a Software Engineer, author of Datumbox Machine Learning Framework and a proud geek. Learn more

Source: http://blog.datumbox.com/measuring-the-social-media-popularity-of-pages-with-dea-in-java/


How 5G Will Impact Customer Experience?


5G is a breakthrough technology that promises to bring new innovations and change the way people use the Internet, with faster connection speeds, lower latency, higher bandwidth, and the ability to connect one million devices per square kilometre. Telcos are deploying 5G to enhance our day-to-day lives.

“When combined with other technologies like Artificial Intelligence and the Internet of Things (IoT), it could mean a lot for the proliferation of other technologies like AR/VR and data analytics.”

5G can be a boon for businesses, delivering increased reliability, efficiency, and performance, if it is used to drive more value for customers and business stakeholders and to meet their expectations. Several of the areas where this will play out are discussed below:

Consumer Expectations are on the Rise

Today, customer service teams provide and manage customer support via call centres and digital platforms. The rollout of 5G is expected to bring further benefits to customer service: it will let teams improve their existing personalized offerings and create new solutions that deepen customer engagement.

For instance, salespeople in a retail store are being equipped with layers of information about customers’ behaviour and preferences that help them build a rich, tailored experience for the customers walking into the store.

Video Conferencing/streaming is Just a Few Clicks Away

Video support is considered to be a critical part of Consumer Experience (CX) and will open new avenues for consumer-led enterprises.

“As per a survey conducted by Oracle with 5,000 people, 75% of people understand the efficiency and value of video chat and voice calls.”

CX representatives can use video support to troubleshoot highly technical situations through video chat and screen sharing with a few clicks, potentially reducing the number of in-home technician visits during critical situations like the coronavirus pandemic.

Also, video conferencing now often includes an option to record a quick instant video describing a process or solution, doing away with long step-by-step emails. Enterprises can develop advanced user guides for troubleshooting, featuring short videos for resolving common problems.

However, video conferencing and chat call for high-definition video quality and demand an uninterrupted network with smooth streaming. This means operators need to carry out network maintenance at regular intervals to check for any 5G PIM (passive intermodulation) formation on cell towers, which can reduce receive sensitivity and performance, degrading network speed, video resolution, and so on.

Thus, PIM testing becomes critical for delivering enhanced, interference-free network services, which are necessary for high-resolution video conferencing, chat, and more.

Increased Smart Devices and the Ability to Troubleshoot via Self-Service

The inception of 5G will give a boost to the already-growing IoT and smart device market.

IoT connections from these smart devices are expected to double between 2019 and 2025, to more than 25 billion, according to the GSMA, an industry organization representing telecom operators across the globe.

With lower latency and improved reliability, 5G has a lot more to offer as it connects a large number of devices. This will ultimately curb the manpower needed for customer support, thereby reducing labour costs for the enterprise. Moreover, IoT-connected devices and 5G’s high-speed network let consumers troubleshoot these devices themselves at home.

To support these high-resolution services, telecom operators need to perform 5G network testing to identify issues and take corrective actions that improve the network and integrate advanced capabilities, making it more efficient than previous generations, with wider network coverage.

Enhanced Augmented Reality (AR) / Virtual Reality (VR) Capabilities

As these tools become widely used, customers are offered virtual stores or immersive experiences, using AR to preview products in their own homes in real time.

“‘Augmented Retail: The New Consumer Reality’ study by Nielsen in 2019 suggested that AR/VR has created a lot of interest in people and they are willing to use these technologies to check out products.” 

Analysis of Bulk Data With Big Data Analytics

Enterprises have to deal with a huge volume of data daily. With its advanced connectivity across a large number of devices, 5G can help collect this data and deliver faster data analytics too.

Companies will be able to process these vast unstructured data sets, combined with Artificial Intelligence (AI), to extract meaningful insights and use them to draft business strategies; for example, using customer behaviour data sets to study buying patterns and target those segments with customized service offerings.

As per Ericsson’s AI in networks report, 68% of Communications Service Providers (CSPs) believe improving CX is a business objective, while more than half already believe AI will be a key technology for improving overall CX. Thus, big data analytics will be crucial for harnessing all this new data and enhancing the customer experience.

Conclusion

From a CX point of view, the benefits of 5G will extend far beyond the individual citizen’s experience. Real-time decisions will accelerate with the prevalence of 5G and the application of other new-age technologies like AI, ML, and IoT. As 5G deployment continues to grow, so will the transition to each of the trends mentioned above, ultimately improving your business’s productivity, growing its customer base, and bringing in more revenue.

Source: https://www.aiiottalk.com/technology/5g-impact-on-customer-experience/

Resiliency And Security: Future-Proofing Our AI Future


Deploying AI in the enterprise means thinking forward for resiliency and security (GETTY IMAGES)

By Allison Proffitt, AI Trends

On the first day of the Second Annual AI World Government conference and expo held virtually October 28-30, a panel moderated by Robert Gourley, cofounder & CTO of OODA, raised the issue of AI resiliency. Future-proofing AI solutions requires keeping your eyes open to upcoming likely legal and regulatory roadblocks, said Antigone Peyton, General Counsel & Innovation Strategist at Cloudigy Law. She takes a “use as little as possible” approach to data, raising questions such as: How long do you really need to keep training data? Can you abstract training data to the population level, removing some risk while still keeping enough data to find dangerous biases?

Stephen Dennis, Director of Advanced Computing Technology Centers at the U.S. Department of Homeland Security, also recommended a forward-looking posture, but in terms of the AI workforce. In particular, Dennis challenged the audience to consider the maturity level of the users of new AI technology. Full automation is not likely a first AI step, he said. Instead, he recommends automating slowly, bringing the team along. Take them a technology that works in the context they are used to, he said. They shouldn’t need a lot of training. Mature your team with the technology. Remove the human from the loop slowly.

Of course, some things will never be fully automated. Brian Drake, U.S. Department of Defense, pointed out that some tasks are inherently human-to-human interactions—such as gathering human intelligence. But AI can help humans do even those tasks better, he said.

He also cautioned enterprises to consider their contingency plan as they automate certain tasks. For example, we rarely remember phone numbers anymore. We’ve outsourced that data to our phones while accepting a certain level of risk. If you deploy a tool that replaces a human analytic activity, that’s fine, Drake said. But be prepared with a contingency plan, a solution for failure.   

Organizing for Resiliency

All of these changes will certainly require some organizational rethinking, the panel agreed. While government is organized in a top down fashion, Dennis said, the most AI-forward companies—Uber, Netflix—organize around the data. That makes more sense, he proposed, if we are carefully using the data.

Data models—like the new car trope—begin degrading the first day they are used. Perhaps the source data becomes outdated. Maybe an edge use case was not fully considered. The deployment of the model itself may prompt a completely unanticipated behavior. We must capture and institutionalize those assessments, Dennis said. He proposed an AI quality control team—different from the team building and deploying algorithms—to understand degradation and evaluate the health of models in an ongoing way. His group is working on this with sister organizations in cyber security, and he hopes the best practices they develop can be shared with the rest of the department and across the government.

Peyton called for education—and reeducation—across organizations. She called the AI systems we use today a “living and breathing animal”. This is not, she emphasized, an enterprise-level system that you buy once and drop into the organization. AI systems require maintenance, and someone must be assigned to that caretaking.

But at least at the Department of Defense, Drake pointed out, all employees are not expected to become data scientists. We’re a knowledge organization, he said, but even if reskilling and retraining are offered, a federal workforce does not have to universally accept those opportunities. However, surveys across DoD have revealed an “appetite to learn and change”, Drake said. The Department is hoping to feed that curiosity with a three-tiered training program offering executive-level overviews, practitioner-level training on the tools currently in place, and formal data science training. He encouraged a similar structure to AI and data science training across other organizations.

Bad AI Actors

Gourley turned the conversation to bad actors. The very first telegraph message between Washington DC and Baltimore in 1844 was an historic achievement. The second and third messages—Gourley said—were spam and fraud. Cybercrime is not new and it is absolutely guaranteed in AI. What is the way forward, Gourley asked the panel.

“Our adversaries have been quite clear about their ambitions in this space,” Drake said. “The Chinese have published a national artificial intelligence strategy; the Russians have done the same thing. They are resourcing those plans and executing them.”

In response, Drake argued for the vital importance of ethics frameworks and for the United States to embrace and use these technologies in an “ethically up front and moral way.” He predicted a formal codification around AI ethics standards in the next couple of years similar to international nuclear weapons agreements now.

Source: https://www.aitrends.com/ai-world-government/deploying-ai-in-the-enterprise-means-thinking-forward-for-resiliency-and-security/

AI Projects Progressing Across Federal Government Agencies


The AI World Government Conference kicked off virtually on Oct. 28 and continues on Oct. 29 and 30. Tune in to learn about AI strategies and plans of federal agencies. (Credit: Getty Images)

By AI Trends Staff

Government agencies are gaining experience with AI on projects, with practitioners focusing on defining the project benefit and on ensuring the data quality is good enough for success. That was a takeaway from talks on the opening day of the Second Annual AI World Government conference and expo held virtually on October 28.

Wendy Martinez, PhD, director of the Mathematical Statistics Research Center, US Bureau of Labor Statistics

Wendy Martinez, PhD, director of the Mathematical Statistics Research Center, with the Office of Survey Methods Research in the US Bureau of Labor Statistics, described a project to use natural language understanding AI to parse text fields of databases and automatically correlate them to job occupations in the federal system. One lesson learned was that, despite interest in sharing experience with other agencies, “You can’t build a model based on a certain dataset and use the model somewhere else,” she stated. Instead, each project needs its own data source, with the model tuned to it.

Renata Miskell, Chief Data Officer in the Office of the Inspector General for the US Department of Health and Human Services, fights fraud and abuse for an agency that oversees over $1 trillion in annual spending, including on Medicare and Medicaid. She emphasized the importance of ensuring that data is not biased and that models generate ethical recommendations. For example, to track fraud in its grant programs awarding over $700 billion annually, “It’s important to understand the data source and context,” she stated. The unit studied five years of data from “single audits” of individual grant recipients, which included a lot of unstructured text data. The goal was to pass relevant info to the audit team. “It took a lot of training,” she stated. “Initially we had many false positives.” The team tuned for data quality and ethical use, steering away from blind assumptions. “If we took for granted that the grant recipients were high risk, we would be unfairly targeting certain populations,” Miskell stated.

Dave Cook, senior director of AI/ML Engineering Services, Figure Eight Federal

In the big picture, many government agencies are engaged in AI projects and a lot of collaboration is going on. Dave Cook is senior director of AI/ML Engineering Services for Figure Eight Federal, which works on AI projects for federal clients. He has years of experience working in private industry and government agencies, mostly now the Department of Defense and intelligence agencies. “In AI in the government right now, groups are talking to one another and trying to identify best practices around whether to pilot, prototype, or scale up,” he said. “The government has made some leaps over the past few years, and a lot of sorting out is still going on.”

Ritu Jyoti, Program VP, AI Research and Global AI Research lead for IDC consultants and a program contributor to the event, has over 20 years of experience working with companies including EMC, IBM Global Services, and PwC Consulting. “AI has progressed rapidly,” she said. A global survey IDC conducted in March found the business drivers for AI adoption to be better customer experience, improved employee productivity, accelerated innovation, and improved risk management. A fair number of AI projects failed; the main reasons were unrealistic expectations, the AI not performing as expected, the project lacking access to the needed data, and the team lacking the necessary skills. “The results indicate a lack of strategy,” Jyoti stated.

David Bray, PhD, Inaugural Director of the nonprofit Atlantic Council GeoTech Center, and a contributor to the event program, posted questions on how data governance challenges the future of AI. He asked what questions practitioners and policymakers around AI should be asking, and how the public can participate more in deciding what can be done with data. “You choose not to be a data nerd at your own peril,” he said.

Anthony Scriffignano, PhD, senior VP & Chief Data Scientist with Dun & Bradstreet, said in the pandemic era with many segments of the economy shut down, companies are thinking through and practicing different ways of doing things. “We sit at the point of inflection. We have enough data and computer power to use the AI techniques invented generations ago in some cases,” he said. This opportunity poses challenges related to what to try and what not to try, and “sometimes our actions in one area cause a disruption in another area.”

AI World Government continues tomorrow and Friday.

(Ed. Note: Dr. Eric Schmidt, former CEO of Google and now chair of the National Security Commission on AI, was today involved in a discussion, Transatlantic Cooperation Around the Future of AI, with Ambassador Mircea Geoana, Deputy Secretary General of the North Atlantic Treaty Organization, and Secretary Robert O. Work, vice chair of the National Security Commission. Convened by the Atlantic Council, the event can be viewed here.)

Source: https://www.aitrends.com/ai-world-government/ai-projects-progressing-across-federal-government-agencies/
