

Optimizing your engagement marketing with personalized recommendations using Amazon Personalize and Braze




Today’s marketer has a wide array of channels to communicate with their customers. However, sending the right message to the right customer on the right channel at the right time remains the preeminent challenge marketers face. In this post, I show you how to combine Braze, a customer engagement platform built on AWS for today’s on-demand, always-connected customers, and Amazon Personalize to meet this challenge and deliver experiences that surprise and delight your customers.

Braze makes it easy to organize your customers into audiences, which update in real time based on their behavior and profile traits. Messaging campaigns are created to target audiences through channels such as email, SMS, and push notifications. Multi-step, multi-channel engagement journeys can also be designed using Braze Canvas. Campaigns and Canvases can be triggered manually, on a schedule, or by customer actions. However, your ability to personalize messages sent to customers is limited to what is available in their profiles. To truly personalize each message, you need to include product and content recommendations based on the learned interests of each customer as they engage with your web and mobile applications.

Amazon Personalize is an AWS service that uses machine learning algorithms to create recommender systems based on the behavioral data of your customers. The recommenders are private to your AWS account and based only on the data you provide. Through the Braze Connected Content feature, you can connect Braze to the same Amazon Personalize recommenders used to power recommendations in your web and mobile applications. Because Amazon Personalize adjusts recommendations for each customer based on their behavior in real time, the messages sent through Braze reflect their current preferences and intent.

Overview of solutions

I present two architectures in this post: one that uses the real-time capabilities of Braze and Amazon Personalize, and another that trades some of the freshness of real-time recommendations for a more cost-effective batch approach. The approach you select should match the goals of your engagement strategy and the scale of your messaging needs. Fortunately, the features and integration options of Braze and Amazon Personalize provide the flexibility to suit your operational requirements.

Real-time integration

We start with a real-time integration architecture. The following diagram depicts the relevant components of a sample ecommerce application in which you use Amazon Personalize to provide machine learning (ML)-powered recommenders, referred to as solutions. The primary data used to build solutions is user-item interaction history. For an ecommerce application, this includes events such as viewing a product, adding a product to a shopping cart, and purchasing a product. When rich metadata on events, items, and users is available, you can incorporate it to further improve the relevance of recommendations from the recommender. Examples of metadata include device type, location, and season for events; category, genre, and price point for items; and users’ age, gender, and subscription tier. After you create solutions, you can create autoscaling API endpoints called campaigns with just a few clicks to retrieve personalized recommendations.

Later in this post, I show you how to deploy this application in your AWS account. A self-guided workshop is also packaged with the application that you use to walk through sending personalized emails with Braze.

Our example ecommerce application retrieves personalized recommendations from a Recommendations microservice that appends the recommended item IDs from Amazon Personalize with rich product information from a Products microservice. As users engage with the application and indicate interest by viewing a product, adding a product to their shopping cart, or purchasing a product, events representing these actions are streamed to Amazon Personalize via the AWS Amplify JavaScript client library where Amazon Personalize automatically adjusts recommendations in real time based on user activity.
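To make the flow concrete, here is a minimal Python sketch of what such a Recommendations service might do internally. The function names, the catalog structure, and the campaign ARN are illustrative assumptions, not the actual Retail Demo Store code; the `get_recommendations` call and its `itemList`/`itemId` response shape are the real Amazon Personalize runtime API.

```python
def fetch_recommendation_ids(campaign_arn, user_id, num_results=4):
    """Call an Amazon Personalize campaign endpoint for a user.

    Requires AWS credentials in the environment; campaign_arn is the ARN
    of your deployed campaign (placeholder here).
    """
    import boto3  # AWS SDK for Python

    runtime = boto3.client("personalize-runtime")
    resp = runtime.get_recommendations(
        campaignArn=campaign_arn,
        userId=str(user_id),
        numResults=num_results,
    )
    # Personalize returns item IDs only; product details come from elsewhere
    return [item["itemId"] for item in resp["itemList"]]


def decorate_with_products(item_ids, product_lookup):
    """Join recommended item IDs with rich product data (e.g., from a
    Products microservice), mirroring the response shape shown later
    in this post. Unknown IDs are skipped."""
    return [
        {"product": product_lookup[item_id]}
        for item_id in item_ids
        if item_id in product_lookup
    ]
```

In the sample application the product metadata comes from the Products microservice; here a simple dictionary stands in for it.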

With personalization built into the application, you can connect Amazon Personalize with Braze to deliver personalized recommendations through outbound engagement channels such as email, SMS, and push notifications.

Braze allows you to create message templates that use the Liquid templating language to substitute placeholders in your template with values from a customer’s profile or even from an external resource. In the real-time architecture, we use the Recommendations microservice from the sample application as the external resource and Braze Connected Content as the feature to retrieve personalized recommendations to include in your message templates. The following Connected Content Liquid tag, placed at the top of your message, illustrates how to call the Recommendations service from Braze to retrieve recommendations for a user:

{% connected_content http://<RecommendationsServiceHostName>/recommendations?userID={{${user_id}}}&fullyQualifyImageUrls=1&numResults=4 :save result %}

The tag has the following elements:

  • Liquid tags are framed within {% and %}. This allows you to embed tags and expressions inside message templates that may also contain text or HTML.
  • The tag type is declared just after the start of the tag. In this case, connected_content is the tag type. For the full list of supported tags, see Personalization Using Liquid Tags.
  • You next define a fully-qualified URL to the HTTP resource that Connected Content calls for each user. You replace <RecommendationsServiceHostName> with the host name for the Elastic Load Balancer for the Recommendations service in your deployment of the sample application.
  • The Recommendations service provides a few resources for different personalization features. The resource for user recommendations is accessed from the /recommendations path.
  • The query string parameters come next. The user is identified by the userID parameter, and the {{${user_id}}} expression instructs Braze to interpolate the user’s ID for each call to the service.
  • The last two query string parameters, fullyQualifyImageUrls=1 and numResults=4, tell the Recommendations service that we want the product image URLs to be fully qualified so they can be displayed in the user’s email client and, in this case, to only return the top four recommendations, respectively.
  • The :save result expression tells Braze to assign the JSON response from the Recommendations service to a template variable named result. With the response saved, you can then access elements of the response using Liquid tags in the rest of the template.
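Putting the pieces above together, the request that Connected Content issues for each user can be sketched in Python (the hostname is a placeholder for your own Recommendations service endpoint):

```python
from urllib.parse import urlencode


def connected_content_url(host, user_id, num_results=4):
    """Build the same request URL that the Connected Content tag issues
    for each user. host is the Recommendations service ELB host name."""
    query = urlencode({
        "userID": str(user_id),            # interpolated per user by Braze
        "fullyQualifyImageUrls": 1,        # absolute image URLs for email clients
        "numResults": num_results,         # top-N recommendations to return
    })
    return f"http://{host}/recommendations?{query}"
```

Braze performs this GET once per message recipient, substituting each user's ID before making the call.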

The following code shows the format of a response from the Recommendations service:

[
  {
    "product": {
      "id": "2",
      "url": "",
      "sk": "",
      "name": "Striped Shirt",
      "category": "apparel",
      "style": "shirt",
      "description": "A classic look for the summer season.",
      "price": 9.99,
      "image": "",
      "featured": "true"
    }
  },
  {
    "product": {
      "id": "1",
      "url": "",
      "sk": "",
      "name": "Black Leather Backpack",
      "category": "accessories",
      "style": "bag",
      "description": "Our handmade leather backpack will look great at the office or out on the town.",
      "price": 109.99,
      "image": "",
      "featured": "true"
    }
  },
  ...
]

For brevity, the preceding code only shows the first two recommended products. Several product attributes are available that you can use in the Braze message template to represent each recommendation. To access a specific element of an array or list as we have here, you can use array subscripting notation in your Liquid tag. For example, the following tag interpolates the product name for the first recommended product in the response. For the preceding sample response, the tag resolves to “Striped Shirt”:

{{result[0].product.name}}

When you combine the information in the personalized recommendation response from the Recommendations service with Liquid tags, the possibilities for building message designs are endless. The following code is a simplified example of how you could display a product recommendation in an HTML email template:

<table>
  <tr>
    <td>
      <a href="{{result[0].product.url}}" target="_blank">
        <img src="{{result[0].product.image}}" width="200" alt="{{result[0].product.name}}" />
      </a>
    </td>
    <td>
      <h2>{{result[0].product.name}}</h2>
      <p>{{result[0].product.description}}</p>
      <p>Only <strong>${{result[0].product.price}}</strong>!</p>
      <a class="button" href="{{result[0].product.url}}">Buy Now</a>
    </td>
  </tr>
</table>

Batch integration

The batch integration architecture replaces the use of the Braze Connected Content feature with an Amazon Personalize batch recommendations job that is used to push attribute updates to Braze. Batch recommendations involve creating a file in an Amazon Simple Storage Service (Amazon S3) bucket that includes the users who you wish to generate recommendations for. A reference to this file is then used to submit a job to Amazon Personalize to generate recommendations for each user in the file and output the results to another Amazon S3 file of your choosing. You can use the output of the batch recommendations job to associate personalized recommendations with user profiles in Braze as custom attributes. The Liquid tags in the message templates we saw earlier are changed to access the recommendations as custom attributes from the user profile rather than the Connected Content response.
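As a rough sketch of the batch flow just described, the following Python shows the JSON Lines input format Amazon Personalize expects and the `create_batch_inference_job` call that submits the job. The ARNs, role, and S3 paths are placeholders you would replace with your own; the API call itself is the real boto3 interface.

```python
import json


def build_batch_input(user_ids):
    """Amazon Personalize batch inference input is JSON Lines: one
    {"userId": "..."} object per line, uploaded to S3."""
    return "\n".join(json.dumps({"userId": str(uid)}) for uid in user_ids)


def submit_batch_job(solution_version_arn, role_arn, s3_input, s3_output, job_name):
    """Submit a batch recommendations job against a solution version.

    Requires AWS credentials; all ARNs and s3:// paths are placeholders.
    No campaign (real-time endpoint) is needed for this path.
    """
    import boto3

    personalize = boto3.client("personalize")
    return personalize.create_batch_inference_job(
        jobName=job_name,
        solutionVersionArn=solution_version_arn,
        roleArn=role_arn,                       # IAM role with S3 read/write access
        jobInput={"s3DataSource": {"path": s3_input}},
        jobOutput={"s3DataDestination": {"path": s3_output}},
    )
```

The output file written to S3 contains one line per input user, each with the recommended item IDs for that user.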

As noted earlier, the trade-off you’re making with the batch approach is sacrificing the freshness of real-time recommendations for a more cost-effective solution. Because batch recommendations don’t require an Amazon Personalize campaign, the additional requests from Connected Content to your campaign for each user are eliminated. For Braze campaigns that target extremely large segments, this can result in a significant reduction in requests. Furthermore, if you don’t need an Amazon Personalize campaign for other purposes or you’re creating an Amazon Personalize solution dedicated to email personalization, you can forego creating a campaign entirely.

The following diagram illustrates one of the many possible approaches to designing a batch architecture. The web application components from the real-time architecture still apply; they are excluded from this diagram for brevity.

You use Amazon CloudWatch Events to periodically trigger an AWS Lambda function that builds an input file for an Amazon Personalize batch recommendations job. When the batch recommendations job is complete, another Lambda function processes the output file, decorates the recommended items with rich product information, and enqueues user update events in Amazon Kinesis Data Streams. Finally, another Lambda function consumes the stream’s events and uses the Braze User API to update user profiles.
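A hedged sketch of the final step in that pipeline follows: building a custom-attribute update for each user and posting a batch of them to the Braze User Track endpoint (`/users/track`). The attribute name `recommended_items` is an example you would choose yourself; check Braze's current API documentation for per-request batch limits and authentication details.

```python
def build_braze_attribute_update(user_id, recommended_products):
    """Build one attributes object for the Braze /users/track endpoint.

    'recommended_items' is an example custom attribute name; Braze stores
    it on the user profile for use in Liquid templates.
    """
    return {
        "external_id": str(user_id),
        "recommended_items": recommended_products,
    }


def send_user_updates(api_key, rest_endpoint, attribute_objects):
    """POST a batch of attribute objects to Braze. rest_endpoint is your
    Braze REST endpoint URL (placeholder); performs a live HTTP call."""
    import json
    import urllib.request

    body = json.dumps({"attributes": attribute_objects}).encode("utf-8")
    req = urllib.request.Request(
        f"{rest_endpoint}/users/track",
        data=body,
        headers={
            "Content-Type": "application/json",
            "Authorization": f"Bearer {api_key}",
        },
    )
    return urllib.request.urlopen(req)
```

In the architecture above, a Lambda function consuming the Kinesis stream would call something like `send_user_updates` for each batch of records it receives.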

The use of a Kinesis data stream provides a few key benefits, including decoupling the batch job from the transactional Braze user update process and the ability to pause, restart, and replay user update events.

Real-time integration walkthrough

You implement the real-time integration in the Retail Demo Store sample ecommerce application. In this post, we walk you through the process of deploying this project in your AWS account and describe how to launch the self-guided Braze workshop bundled with the application.

You complete the following steps:

  1. Deploy the Retail Demo Store project to your AWS account using the supplied AWS CloudFormation templates (25–30 minutes).
  2. Build Amazon Personalize solutions and campaigns that provide personalized recommendations (2 hours).
  3. Import users into Braze and build a Braze campaign that uses Connected Content to retrieve personalized recommendations from Amazon Personalize (1 hour).
  4. Clean up resources.


For this walkthrough, you need the following prerequisites:

  • An AWS account
  • A user in your AWS account with the necessary privileges to deploy the project
  • A Braze account

If you don’t have a Braze account, please contact your Braze representative. We also assume that you have completed at least the Getting Started with Braze LAB course.

Step 1: Deploying the Retail Demo Store to your AWS account

From the following list, choose Launch Stack in the Region of your choice. This list of Regions doesn’t represent all possible Regions where you can deploy the project, just the Regions currently configured for deployment.

  • US East (N. Virginia)
  • US West (Oregon)
  • Europe (Ireland)
Accept all the default template parameter values and launch the template. The deployment of the project’s resources takes 25–30 minutes.

Step 2: Building Amazon Personalize campaigns

Before you can provide personalized product recommendations, you first need to train the ML models and provision the inference endpoints in Amazon Personalize that you need to retrieve recommendations. The CloudFormation template deployed in Step 1 includes an Amazon SageMaker notebook instance that provides a Jupyter notebook with detailed step-by-step instructions. The notebook takes approximately 2 hours to complete.

  1. Sign in to the AWS account where you deployed the CloudFormation template in Step 1.
  2. On the Amazon SageMaker console, choose Notebook instances.
  3. If you don’t see the RetailDemoStore notebook instance, make sure you’re in the same Region where you deployed the project.
  4. To access the notebook instance, choose Open Jupyter or Open JupyterLab.
  5. When the Jupyter web interface is loaded for the notebook instance, choose the workshop/1-Personalization/1.1-Personalize.ipynb notebook.

The notebooks are organized in a directory structure, so you may have to choose the workshop folder to see the notebook subdirectories.

  6. When you have the 1.1-Personalize notebook open, step through the workshop by reading and running each cell.

You can choose Run from the Jupyter toolbar to sequentially run the code in the cells.

Step 3: Sending personalized messages from Braze

With the Amazon Personalize solutions and campaigns to produce personalized recommendations in place, you can now import users into your Braze account, build a messaging template that uses Braze Connected Content to retrieve recommendations from Amazon Personalize, and build a Braze campaign to send targeted emails to your users.

Similar to the Personalization workshop in Step 1, the Braze messaging workshop steps you through the process. This notebook takes approximately 1 hour to complete.

  1. If necessary, repeat the instructions in Step 1 to open a Jupyter or JupyterLab browser window from the Amazon SageMaker notebook instance in your Retail Demo Store deployment.
  2. When the Jupyter web interface is loaded for the notebook instance, choose the workshop/4-Messaging/4.2-Braze.ipynb notebook.

As before, you may have to choose the workshop folder to see the notebook subdirectories.

  3. When you have the 4.2-Braze notebook open, step through the workshop by reading and running each cell.

Step 4: Cleaning up

To avoid incurring future charges, delete the resources the Retail Demo Store project created by deleting the CloudFormation stack you used during deployment. For more information about the source code for this post and the full Retail Demo Store project, see the GitHub repo.


As marketers compete for the attention of customers through outbound messaging, there is increasing pressure to effectively target the right users, at the right time, on the right channel, and with the right messaging. Braze provides the solution to the first three challenges. You can solve the final challenge with Braze Connected Content and Amazon Personalize, and deliver highly personalized product and content recommendations that reflect each customer’s current interests.

How are you using outbound messaging to reach your customers? Is there an opportunity to increase engagement with your customers with more relevant and personalized content?

About Braze

Braze is an AWS Advanced Technology Partner and holder of the AWS Digital Customer Experience and Retail competencies. Top global brands such as ABC News, Urban Outfitters, Rakuten, and Gap are sending tens of billions of messages per month to over 2 billion monthly active users with Braze.

About the Author

James Jory is a Solutions Architect in Applied AI with AWS. He has a special interest in personalization and recommender systems and a background in ecommerce, marketing technology, and customer data analytics. In his spare time, he enjoys camping and auto racing simulation.



How Will 5G Impact Customer Experience?




5G is a breakthrough technology that promises new innovations and will change how people traverse the Internet, with faster connection speeds, lower latency, high bandwidth, and the ability to connect one million devices per square kilometre. Telcos are deploying 5G to enhance our day-to-day lives.

“When clubbed with other technologies like Artificial Intelligence, Internet of Things (IoT), it could mean a lot to a proliferation of other technologies like AR/VR, data analytics.” 

5G can be a boon for businesses, delivering increased reliability, efficiency, and performance, if it is used to drive more value for customers and business stakeholders and to meet their expectations with the help of the digital technologies discussed below:

Consumer Expectations are on the Rise

Today, customer service teams provide and manage customer support via call centres and digital platforms. The rollout of 5G is expected to bring further benefits, with a positive impact on customer service: teams can improve their existing personalized service offerings and create new solutions that deepen customer engagement.

For instance, salespeople in a retail store can be equipped with layers of information about customers’ behaviour and choices, helping them build a rich, tailored experience for customers walking into the store.

Video Conferencing/streaming is Just a Few Clicks Away

Video support is considered to be a critical part of Consumer Experience (CX) and will open new avenues for consumer-led enterprises.

“As per a survey conducted by Oracle with 5k people, 75% of people understand the efficiency and value of video chat and voice calls.” 

CX representatives can use video support to troubleshoot highly technical situations through video chat and screen sharing with a few clicks, potentially reducing the number of in-house technician visits during critical situations like the coronavirus pandemic.

Also, video conferencing nowadays offers the option to record a quick instant video describing a process or solution, replacing the long process of sending step-by-step emails. Enterprises can develop advanced user guides for troubleshooting issues, featuring video teasers for resolving common problems.

However, video conferencing and chat call for high-definition video quality and demand an uninterrupted network with smooth video streaming. This means operators need to carry out network maintenance at regular intervals to check for any 5G PIM (passive intermodulation) formation on network cell towers, which could reduce receive sensitivity and performance, thereby deteriorating network speed, video resolution, and more.

Thus, PIM testing becomes critical for delivering enhanced, interference-free network services, which are necessary for high-resolution online video conferencing, chat, and more.

Increased Smart Devices and the Ability to Troubleshoot via Self-Service

The inception of 5G will give a boost to the already growing IoT and smart device market.

The number of these smart device IoT connections is expected to double between 2019 and 2025 to more than 25 billion, as per the GSM Association, an industry organization representing telecom operators across the globe.

With lower latency and improved reliability, 5G has much more to offer as it connects a large number of devices. This can reduce the manpower needed for customer support, thereby lowering labour costs for the enterprise. Moreover, IoT-connected devices and the high-speed 5G network let consumers troubleshoot these devices themselves at home.

To facilitate these high-performance networks, telecom operators need to perform 5G network testing, identify issues, and take corrective actions that improve their networks and integrate advanced capabilities, making 5G more efficient than previous generations while offering wider network coverage.

Enhanced Augmented Reality (AR) / Virtual Reality (VR) Capabilities

As these tools become widely used, customers are offered virtual stores and immersive experiences, using AR to get a sneak peek of products in their own homes in real time.

“‘Augmented Retail: The New Consumer Reality’ study by Nielsen in 2019 suggested that AR/VR has created a lot of interest in people and they are willing to use these technologies to check out products.” 

Analysis of Bulk Data With Big Data Analytics

Enterprises have to deal with a huge volume of data daily. With its advanced network connectivity across a large number of devices, 5G can help collect this data and deliver faster data analytics too.

Companies will be able to process this vast amount of unstructured data, combined with Artificial Intelligence (AI), to extract meaningful insights and use them to draft business strategies, such as using customer behaviour data sets to study buying behaviour and targeting those segments with customized service offerings tailored to their requirements.

As per Ericsson’s AI in networks report, 68% of Communications Service Providers (CSPs) believe improving CX is a business objective, while more than half already believe AI will be a key technology for improving overall CX. Thus, big data analytics will be crucial for harnessing all this new data and enhancing the customer experience.


From a CX point of view, the benefits of 5G will extend far beyond the individual citizen’s experience. Real-time decision-making will accelerate with the prevalence of 5G and the application of other new-age technologies like AI, ML, and IoT. As 5G deployment continues to grow, so will the adoption of the trends mentioned above, ultimately improving your business’s productivity, growing its customer base, and bringing in more revenue.




Resiliency And Security: Future-Proofing Our AI Future




Deploying AI in the enterprise means thinking forward for resiliency and security (GETTY IMAGES)

By Allison Proffitt, AI Trends

On the first day of the Second Annual AI World Government conference and expo held virtually October 28-30, a panel moderated by Robert Gourley, cofounder & CTO of OODA, raised the issue of AI resiliency. Future-proofing AI solutions requires keeping your eyes open to upcoming likely legal and regulatory roadblocks, said Antigone Peyton, General Counsel & Innovation Strategist at Cloudigy Law. She takes a “use as little as possible” approach to data, raising questions such as: How long do you really need to keep training data? Can you abstract training data to the population level, removing some risk while still keeping enough data to find dangerous biases?

Stephen Dennis, Director of Advanced Computing Technology Centers at the U.S. Department of Homeland Security, also recommended a forward-looking posture, but in terms of the AI workforce. In particular, Dennis challenged the audience to consider the maturity level of the users of new AI technology. Full automation is not likely a first AI step, he said. Instead, he recommends automating slowly, bringing the team along. Take them a technology that works in the context they are used to, he said. They shouldn’t need a lot of training. Mature your team with the technology. Remove the human from the loop slowly.

Of course, some things will never be fully automated. Brian Drake, U.S. Department of Defense, pointed out that some tasks are inherently human-to-human interactions—such as gathering human intelligence. But AI can help humans do even those tasks better, he said.

He also cautioned enterprises to consider their contingency plan as they automate certain tasks. For example, we rarely remember phone numbers anymore. We’ve outsourced that data to our phones while accepting a certain level of risk. If you deploy a tool that replaces a human analytic activity, that’s fine, Drake said. But be prepared with a contingency plan, a solution for failure.   

Organizing for Resiliency

All of these changes will certainly require some organizational rethinking, the panel agreed. While government is organized in a top down fashion, Dennis said, the most AI-forward companies—Uber, Netflix—organize around the data. That makes more sense, he proposed, if we are carefully using the data.

Data models—like the new car trope—begin degrading the first day they are used. Perhaps the source data becomes outdated. Maybe an edge use case was not fully considered. The deployment of the model itself may prompt a completely unanticipated behavior. We must capture and institutionalize those assessments, Dennis said. He proposed an AI quality control team—different from the team building and deploying algorithms—to understand degradation and evaluate the health of models in an ongoing way. His group is working on this with sister organizations in cyber security, and he hopes the best practices they develop can be shared to the rest of the department and across the government.

Peyton called for education—and reeducation—across organizations. She called the AI systems we use today a “living and breathing animal”. This is not, she emphasized, an enterprise-level system that you buy once and drop into the organization. AI systems require maintenance, and someone must be assigned to that caretaking.

But at least at the Department of Defense, Drake pointed out, all employees are not expected to become data scientists. We’re a knowledge organization, he said, but even if reskilling and retraining are offered, a federal workforce does not have to universally accept those opportunities. However, surveys across DoD have revealed an “appetite to learn and change”, Drake said. The Department is hoping to feed that curiosity with a three-tiered training program offering executive-level overviews, practitioner-level training on the tools currently in place, and formal data science training. He encouraged a similar structure to AI and data science training across other organizations.

Bad AI Actors

Gourley turned the conversation to bad actors. The very first telegraph message between Washington DC and Baltimore in 1844 was an historic achievement. The second and third messages—Gourley said—were spam and fraud. Cybercrime is not new and it is absolutely guaranteed in AI. What is the way forward, Gourley asked the panel.

“Our adversaries have been quite clear about their ambitions in this space,” Drake said. “The Chinese have published a national artificial intelligence strategy; the Russians have done the same thing. They are resourcing those plans and executing them.”

In response, Drake argued for the vital importance of ethics frameworks and for the United States to embrace and use these technologies in an “ethically up front and moral way.” He predicted a formal codification around AI ethics standards in the next couple of years similar to international nuclear weapons agreements now.




AI Projects Progressing Across Federal Government Agencies




The AI World Government Conference kicked off virtually on Oct. 28 and continues on Oct. 29 and 30. Tune in to learn about AI strategies and plans of federal agencies. (Credit: Getty Images)

By AI Trends Staff

Government agencies are gaining experience with AI on projects, with practitioners focusing on defining the project benefit and ensuring the data quality is good enough to guarantee success. That was a takeaway from talks on the opening day of the Second Annual AI World Government conference and expo, held virtually on October 28.

Wendy Martinez, PhD, director of the Mathematical Statistics Research Center, US Bureau of Labor Statistics

Wendy Martinez, PhD, director of the Mathematical Statistics Research Center, with the Office of Survey Methods Research in the US Bureau of Labor Statistics, described a project to use natural language understanding AI to parse text fields of databases, and automatically correlate them to job occupations in the federal system. One lesson learned was that, despite interest in sharing experience with other agencies, “You can’t build a model based on a certain dataset and use the model somewhere else,” she stated. Instead, each project needs its own source of data and a model tuned to it.

Renata Miskell, Chief Data Officer in the Office of the Inspector General for the US Department of Health and Human Services, fights fraud and abuse for an agency that oversees over $1 trillion in annual spending, including on Medicare and Medicaid. She emphasized the importance of ensuring that data is not biased and that models generate ethical recommendations. For example, to track fraud in its grant programs awarding over $700 billion annually, “It’s important to understand the data source and context,” she stated. The unit studied five years of data from “single audits” of individual grant recipients, which included a lot of unstructured text data. The goal was to pass relevant info to the audit team. “It took a lot of training,” she stated. “Initially we had many false positives.” The team tuned for data quality and ethical use, steering away from blind assumptions. “If we took for granted that the grant recipients were high risk, we would be unfairly targeting certain populations,” Miskell stated.

Dave Cook, senior director of AI/ML Engineering Services, Figure Eight Federal

In the big picture, many government agencies are engaged in AI projects and a lot of collaboration is going on. Dave Cook is senior director of AI/ML Engineering Services for Figure Eight Federal, which works on AI projects for federal clients. He has years of experience working in private industry and government agencies, mostly now the Department of Defense and intelligence agencies. “In AI in the government right now, groups are talking to one another and trying to identify best practices around whether to pilot, prototype, or scale up,” he said. “The government has made some leaps over the past few years, and a lot of sorting out is still going on.”

Ritu Jyoti, Program VP, AI Research and Global AI Research lead for IDC consultants and a program contributor to the event, has over 20 years of experience working with companies including EMC, IBM Global Services, and PwC Consulting. “AI has progressed rapidly,” she said. From a global survey IDC conducted in March, business drivers for AI adoption were found to be better customer experience, improved employee productivity, accelerated innovation, and improved risk management. A fair number of AI projects failed. The main reasons were unrealistic expectations, the AI did not perform as expected, the project did not have access to the needed data, and the team lacked the necessary skills. “The results indicate a lack of strategy,” Jyoti stated.

David Bray, PhD, Inaugural Director of the nonprofit Atlantic Council GeoTech Center, and a contributor to the event program, posted questions on how data governance challenges the future of AI. He asked what questions practitioners and policymakers around AI should be asking, and how the public can participate more in deciding what can be done with data. “You choose not to be a data nerd at your own peril,” he said.

Anthony Scriffignano, PhD, senior VP and Chief Data Scientist with Dun & Bradstreet, said that in the pandemic era, with many segments of the economy shut down, companies are thinking through and practicing different ways of doing things. “We sit at the point of inflection. We have enough data and computer power to use the AI techniques invented generations ago in some cases,” he said. This opportunity poses challenges around what to try and what not to try, and “sometimes our actions in one area cause a disruption in another area.”

AI World Government continues tomorrow and Friday.

(Ed. Note: Dr. Eric Schmidt, former CEO of Google and now chair of the National Security Commission on AI, today took part in a discussion, Transatlantic Cooperation Around the Future of AI, with Ambassador Mircea Geoana, Deputy Secretary General of the North Atlantic Treaty Organization, and Secretary Robert O. Work, vice chair of the National Security Commission. Convened by the Atlantic Council, the event can be viewed here.)

