Deploy Machine Learning Pipeline on AWS Fargate

A step-by-step beginner’s guide to containerizing an ML pipeline and deploying it serverless on AWS Fargate.

By Moez Ali, Founder & Author of PyCaret


In our last post on deploying a machine learning pipeline in the cloud, we demonstrated how to develop a machine learning pipeline in PyCaret, containerize it with Docker and serve it as a web application using Google Kubernetes Engine. If you haven’t heard about PyCaret before, please read this announcement to learn more.

In this tutorial, we will use the same machine learning pipeline and Flask app that we built and deployed previously. This time we will demonstrate how to containerize and deploy a machine learning pipeline serverless using AWS Fargate.

👉 Learning Goals of this Tutorial

  • What is a Container? What is Docker? What is Kubernetes?
  • What is Amazon Elastic Container Service (ECS)?
  • What are AWS Fargate and serverless deployment?
  • Build and push a Docker image onto Amazon Elastic Container Registry.
  • Create and execute a task definition using AWS-managed infrastructure i.e. AWS Fargate.
  • See a web app in action that uses a trained machine learning pipeline to predict new data points in real-time.

This tutorial covers the entire workflow: building a Docker image locally, uploading it to Amazon Elastic Container Registry, creating a cluster, and then defining and executing a task using AWS-managed infrastructure, i.e. AWS Fargate.

In the past, we have covered deployment on other cloud platforms such as Azure and Google. If you are interested in learning more about those, you can read the following stories:

💻 Toolbox for this tutorial


PyCaret is an open source, low-code machine learning library in Python that is used to train and deploy machine learning pipelines and models into production. PyCaret can be installed easily using pip.

pip install pycaret


Flask is a framework that allows you to build web applications. A web application can be a commercial website, blog, e-commerce system, or an application that generates predictions from data provided in real-time using trained models. If you don’t have Flask installed, you can use pip to install it.

Docker Toolbox for Windows 10 Home

Docker is a tool designed to make it easier to create, deploy, and run applications by using containers. Containers are used to package up an application with all of its necessary components, such as libraries and other dependencies, and ship it all out as one package. If you haven’t used docker before, this tutorial also covers the installation of Docker Toolbox (legacy) on Windows 10 Home. In the previous tutorial we covered how to install Docker Desktop on Windows 10 Pro edition.

Amazon Web Services (AWS)

Amazon Web Services (AWS) is a comprehensive and broadly adopted cloud platform, offered by Amazon. It has over 175 fully-featured services from data centers globally. If you haven’t used AWS before, you can sign-up for a free account.

✔️ Let’s get started…

What is a Container?

Before we get into implementation using AWS Fargate, let’s understand what a container is and why we would need one.

Have you ever had the problem where your code works fine on your computer, but when a friend tries to run the exact same code, it doesn’t work? If your friend is repeating the exact same steps, he or she should get the same results, right? The one-word answer to this is the environment: your friend’s environment is different from yours.

What does an environment include? → The programming language, such as Python, and all the libraries and dependencies, at the exact versions with which the application was built and tested.

If we can create an environment that we can transfer to other machines (for example: your friend’s computer or a cloud service provider like Google Cloud Platform), we can reproduce the results anywhere. Hence, a container is a type of software that packages up an application and all its dependencies so the application runs reliably from one computing environment to another.
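For a Python application, that environment can be pinned in a requirements.txt file that Docker installs when the image is built. A minimal, hypothetical example (the exact packages and versions are illustrative, not taken from this project):

```text
flask==1.1.2
pandas==1.0.3
pycaret==1.0.0
```

Pinning exact versions is what makes the container reproducible: anyone who builds the image gets the same libraries you tested with.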

What is Docker?

Docker is a company that provides software (also called Docker) that allows users to build, run and manage containers. While Docker’s containers are the most common, there are other, less well-known alternatives such as LXD and LXC.


Now that you theoretically understand what a container is and how Docker is used to containerize applications, let’s imagine a scenario where you have to run multiple containers across a fleet of machines to support an enterprise-level machine learning application with varied workloads during day and night. This is pretty common in real life, and as simple as it may sound, it is a lot of work to do manually.

You need to start the right containers at the right time, figure out how they can talk to each other, handle storage considerations, deal with failed containers or hardware, and a million other things!

This entire process of managing hundreds and thousands of containers to keep the application up and running is known as container orchestration. Don’t get caught up in the technical details yet.

At this point, you must recognize that managing a real-life application requires more than one container, and that managing all of the infrastructure to keep those containers up and running is a cumbersome, manual, administrative burden.

This brings us to Kubernetes.

What is Kubernetes?

Kubernetes is an open-source system developed by Google in 2014 for managing containerized applications. In simple words, Kubernetes is a system for running and coordinating containerized applications across a cluster of machines.


Photo by chuttersnap on Unsplash

While Kubernetes is an open-source system developed by Google, almost all major cloud service providers offer Kubernetes as a managed service. For example: Amazon Elastic Kubernetes Service (EKS) offered by Amazon, Google Kubernetes Engine (GKE) offered by Google, and Azure Kubernetes Service (AKS) offered by Microsoft.

So far we have discussed and understood:
✔️ A container
✔️ Docker
✔️ Kubernetes

Before introducing AWS Fargate, there is only one thing left to discuss and that is Amazon’s own container orchestration service Amazon Elastic Container Service (ECS).

Amazon Elastic Container Service (ECS)

Amazon Elastic Container Service (Amazon ECS) is Amazon’s home-grown container orchestration platform. The idea behind ECS is similar to Kubernetes (both of them are orchestration services).

ECS is an AWS-native service, meaning that it can only be used on AWS infrastructure. On the other hand, EKS is based on Kubernetes, an open-source project which is available to users running multi-cloud (AWS, GCP, Azure) and even on-premises.

Amazon also offers a Kubernetes-based container orchestration service known as Amazon Elastic Kubernetes Service (Amazon EKS). Even though the purpose of ECS and EKS is pretty similar, i.e. orchestrating containerized applications, there are quite a few differences in pricing, compatibility and security. There is no best answer, and the choice of solution depends on the use-case.

Irrespective of which container orchestration service you use (ECS or EKS), there are two ways you can implement the underlying infrastructure:

  1. Manually manage the cluster and the underlying infrastructure such as virtual machines/servers (also known as EC2 instances in AWS).
  2. Serverless — Absolutely no need to manage anything. Just upload the container and that’s it. ← This is AWS Fargate.

Amazon ECS underlying infrastructure

AWS Fargate — serverless compute for containers

AWS Fargate is a serverless compute engine for containers that works with both Amazon Elastic Container Service (ECS) and Amazon Elastic Kubernetes Service (EKS). Fargate makes it easy for you to focus on building your applications. Fargate removes the need to provision and manage servers, lets you specify and pay for resources per application, and improves security through application isolation by design.

Fargate allocates the right amount of compute, eliminating the need to choose instances and scale cluster capacity. You only pay for the resources required to run your containers, so there is no over-provisioning and paying for additional servers.


How AWS Fargate works

There is no best answer as to which approach is better. The choice between going serverless or manually managing an EC2 cluster depends on the use-case. Some pointers that can assist with this choice include:

ECS EC2 (Manual Approach)

  • You are all-in on AWS.
  • You have a dedicated Ops team in place to manage AWS resources.
  • You have an existing footprint on AWS, i.e. you are already managing EC2 instances.

AWS Fargate

  • You do not have a huge Ops team to manage AWS resources.
  • You do not want operational responsibility or want to reduce it.
  • Your application is stateless (A stateless app is an application that does not save client data generated in one session for use in the next session with that client).

Setting the Business Context

An insurance company wants to improve its cash flow forecasting by better predicting patient charges using demographic and basic patient health risk metrics at the time of hospitalization.


The objective is to build and deploy a web application where the demographic and health information of a patient is entered into a web-based form, which then outputs a predicted charge amount.


  • Train and develop a machine learning pipeline for deployment.
  • Build a web app using a Flask framework. It will use the trained ML pipeline to generate predictions on new data points in real-time.
  • Build and push a Docker image onto Amazon Elastic Container Registry.
  • Create and execute a task to deploy the app using AWS Fargate serverless infrastructure.

Since we have already covered the first two tasks in our initial tutorial, we will quickly recap them and then focus on the remaining items in the list above. If you are interested in learning more about developing a machine learning pipeline in Python using PyCaret and building a web app using a Flask framework, please read this tutorial.

👉 Develop a Machine Learning Pipeline

We are using PyCaret in Python to train and develop a machine learning pipeline which will be used as part of our web app. The machine learning pipeline can be developed in an Integrated Development Environment (IDE) or a notebook. We used a notebook to run the code below:

When you save a model in PyCaret, the entire transformation pipeline, based on the configuration defined in the setup() function, is created. All inter-dependencies are orchestrated automatically. See the pipeline and model stored in the ‘deployment_28042020’ variable:


Machine Learning Pipeline created using PyCaret

👉 Build a Web Application

This tutorial is not focused on building a Flask application. It is only discussed here for completeness. Now that our machine learning pipeline is ready we need a web application that can connect to our trained pipeline to generate predictions on new data points in real-time. We have created the web application using Flask framework in Python. There are two parts of this application:

  • Front-end (designed using HTML)
  • Back-end (developed using Flask)

This is how our web application looks:


Web application on local machine
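As a rough illustration of the back-end, here is a minimal sketch of such a Flask app. The `predict_charges` helper is a hypothetical stand-in for the trained PyCaret pipeline (the real app renders an HTML form and calls `predict_model` on the saved pipeline):

```python
from flask import Flask, request

app = Flask(__name__)

def predict_charges(form_data):
    # Hypothetical stand-in: the real app would load the saved PyCaret
    # pipeline and call predict_model() on the submitted patient data.
    return 0.0

@app.route('/', methods=['GET', 'POST'])
def home():
    if request.method == 'POST':
        # The form fields hold the patient's demographic and health data.
        prediction = predict_charges(request.form.to_dict())
        return f'Predicted charges: {prediction}'
    # The real app serves an HTML form here via render_template().
    return 'Enter patient details in the form.'

# In the container, the app is started with app.run(host='0.0.0.0', port=5000),
# which is why port 5000 is exposed and opened in the steps below.
```

This is only a sketch of the structure; the complete front-end and back-end code is in the GitHub repository linked below.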

If you haven’t followed along so far, no problem. You can simply fork this repository from GitHub. This is how your project folder should look at this point:

10 steps to deploy an ML pipeline using AWS Fargate:

👉 Step 1 — Install Docker Toolbox (for Windows 10 Home)

In order to build a docker image locally, you will need Docker installed on your computer. If you are using Windows 10 64-bit: Pro, Enterprise, or Education (Build 15063 or later) you can download Docker Desktop from DockerHub.

However, if you are using Windows 10 Home, you would need to install the last release of the legacy Docker Toolbox (v19.03.1) from Docker’s GitHub page.

Download and run the DockerToolbox-19.03.1.exe file.

The easiest way to check if the installation was successful is by opening the command prompt and typing in ‘docker’. It should print the help menu.


Anaconda Prompt to check docker

👉 Step 2 — Create a Dockerfile

The first step for creating a Docker image is to create a Dockerfile in the project directory. A Dockerfile is just a file with a set of instructions. The Dockerfile for this project looks like this:
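As a hedged sketch, a Dockerfile for a Flask app like this one typically contains instructions along these lines (the base image, file names, and versions are assumptions; the exact file is in the GitHub repository):

```dockerfile
# Start from a slim Python base image.
FROM python:3.6-slim

WORKDIR /app

# Copy the app code, the saved pipeline, and the dependency list.
COPY . /app

# Install the pinned dependencies.
RUN pip install -r requirements.txt

# The Flask app listens on port 5000.
EXPOSE 5000

CMD ["python", "app.py"]
```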

A Dockerfile is case-sensitive and must be in the project folder with the other project files. A Dockerfile has no extension and can be created using any text editor. You can download the Dockerfile used in this project from this GitHub Repository.

👉 Step 3 — Create a Repository in Elastic Container Registry (ECR)

(a) Login to your AWS console and search for Elastic Container Registry:


AWS Console

(b) Create a new repository:


Create New Repository on Amazon Elastic Container Registry

For this demo we have created ‘pycaret-deployment-aws-repository’.

(c) Click on “View push commands”:



(d) Copy Push Commands:


Push commands for pycaret-deployment-aws-repository

👉 Step 4 — Execute push commands

Navigate to your project folder using Anaconda Prompt and execute the commands you copied in the step above. The code below is for demonstration only and may not work as-is. To get the right commands to execute, copy them from “View push commands” inside your repository.

You must be in the folder where the Dockerfile and the rest of your code reside before executing these commands.

Command 1
aws ecr get-login-password --region ca-central-1 | docker login --username AWS --password-stdin <account-id>.dkr.ecr.ca-central-1.amazonaws.com

Command 2
docker build -t pycaret-deployment-aws-repository .

Command 3
docker tag pycaret-deployment-aws-repository:latest <account-id>.dkr.ecr.ca-central-1.amazonaws.com/pycaret-deployment-aws-repository:latest

Command 4
docker push <account-id>.dkr.ecr.ca-central-1.amazonaws.com/pycaret-deployment-aws-repository:latest

(Here <account-id> is a placeholder; your actual registry URI appears in the “View push commands” panel.)

👉 Step 5 — Check your uploaded image

Click on the repository you created, and you will see the image URI of the image uploaded in the step above. Copy the image URI (it will be needed in step 7 below).

👉 Step 6 — Create and Configure a Cluster

(a) Click on “Clusters” on left-side menu:


Create Cluster — Step 1

(b) Select “Networking only” and click Next step:


Select Networking Only Template

(c) Configure Cluster (Enter cluster name) and click on Create:


Configure Cluster

(d) Cluster Created:


Cluster Created

👉 Step 7 — Create a new Task definition

A task definition is required to run Docker containers in Amazon ECS. Some of the parameters you can specify in a task definition include: the Docker image to use with each container in your task, and how much CPU and memory to use with each task or each container within a task.
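The console screens below fill in the same fields that appear in the JSON view of a task definition. A hedged sketch of what the resulting definition roughly looks like (the image URI, CPU, and memory values are placeholders for whatever you configure):

```json
{
  "family": "pycaret-deployment-aws",
  "requiresCompatibilities": ["FARGATE"],
  "networkMode": "awsvpc",
  "cpu": "1024",
  "memory": "2048",
  "containerDefinitions": [
    {
      "name": "pycaret-deployment-aws-repository",
      "image": "<image-uri-from-step-5>",
      "portMappings": [{"containerPort": 5000, "protocol": "tcp"}],
      "essential": true
    }
  ]
}
```

Note that Fargate tasks require the awsvpc network mode, and CPU/memory must be one of the supported Fargate combinations.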

(a) Click on “Create new task definition”:


Create a new task definition

(b) Select “FARGATE” as launch type:


Select Launch Type Compatibility

(c) Fill in the details:


Configure Task and container definitions (part 1)


Configure Task and container definitions (part 2)

(d) Click on “Add Containers” and fill in the details:


Adding Container in task definitions

(e) Click “Create Task” on the bottom right.

👉 Step 8 — Execute Task Definition

In step 7 we created a task definition that will start the container. Now we will execute the task by clicking “Run Task” under Actions.


Execute Task Definition

(a) Click on “Switch to launch type” to change the type to Fargate:


Running Task — Part 1

(b) Select the VPC and Subnet from the dropdown:


Running Task — Part 2

(c) Click on “Run Task” on bottom right:


Task Created Successfully

👉 Step 9 — Allow inbound port 5000 from Network settings

One last step before we can see our application in action on the public IP address is to allow traffic on port 5000 by creating a new rule. To do that, follow these steps:

(a) Click on Task


(b) Click on ENI Id:

(c) Click on Security groups

(d) Click on “Edit inbound rules”

(e) Add a Custom TCP rule of port 5000

👉 Step 10 — See the app in action

Use the public IP address with port 5000 to access the application.


Task definition logs


Final app uploaded on

Note: By the time this story is published, the app will be removed from the public address to restrict resource consumption.

PyCaret 2.0.0 is coming!

We have received overwhelming support and feedback from the community. We are actively working on improving PyCaret and preparing for our next release. PyCaret 2.0.0 will be bigger and better. If you would like to share your feedback and help us improve further, you may fill out this form on the website or leave a comment on our GitHub or LinkedIn page.

Follow our LinkedIn and subscribe to our YouTube channel to learn more about PyCaret.

Want to learn about a specific module?

As of the first release 1.0.0, PyCaret has the following modules available for use. Click on the links below to see the documentation and working examples in Python.

Also see:

PyCaret getting started tutorials in Notebook:

Would you like to contribute?

PyCaret is an open source project. Everybody is welcome to contribute. If you would like to contribute, please feel free to work on open issues. Pull requests are accepted with unit tests on the dev-1.0.1 branch.

Please give us ⭐️ on our GitHub repo if you like PyCaret.


Bio: Moez Ali is a Data Scientist and the Founder & Author of PyCaret.

Original. Reposted with permission.




How Can Technology Help Fight the COVID-19 Pandemic?




Illustration: © IoT For All

As the COVID-19 pandemic continues unfolding, technology solutions and government initiatives are multiplying to help monitor and control the virus’s journey. Their aid includes reducing the load on the health system and reinforcing the efforts of overworked and burned-out healthcare workers.

While smart technologies cannot replace or compensate for public-institution measures, they do play a crucial role in emergency responses. Let’s take a look at promising use cases of how technology can help fight the novel coronavirus outbreak.

Technologies Used for Good

People tend to think of technology as a heartless machine, which is true, but only until it’s used for good. Just look at all the wonderful things we’ve managed to do with its help.

Telemedicine is gaining traction by offering remote patient monitoring and interactive remote doctor’s visits. At the same time, 3D printing and open-source solutions are facilitating the production of more affordable face masks, ventilators, and breathing filters, as well as optimizing the supply of medical equipment. The pandemic has even driven scientists to desperate measures: they are now experimenting with gene editing, synthetic biology, and nanotechnology to develop and test vaccines faster than ever in the history of humanity.

Smart technologies like the Internet of things (IoT), big data, and artificial intelligence (AI) are being massively adopted to help track the disease spread and contagion, manage insurance payments, uphold medical supply chains, and enforce restrictive measures. Let’s go step by step to see how IoT, AI, big data, and mobile solutions are actually enhancing medical care.

IoT for Smart Patient Care Management and Home Automation

IoT has already found its use among healthcare providers. Today, connected patient imaging, health devices and applications, worker solutions, and ambulance programs are being adopted globally. But COVID-19 made the technology take on new applications to help the world combat the epidemic. Tracking quarantine, pre-screening and diagnosing, cleaning and disinfecting, innovative usage of drones, and reducing in-home infections are all “new normals” thanks to IoT.


For example, the American health technology company Kinsa creates smart thermometers that screen and aggregate people’s temperature and symptom data in real time. Having gathered data from over one million connected thermometers, Kinsa rolled out its US HealthWeather™ Map.

The map is updated daily, highlighting how severely the population is being affected by influenza-like illness (ILI). This real-time information helps health authorities see increases in fevers as early indicators of community spread of COVID-19 and streamline the allocation of health resources. These areas are marked in the “Atypical” mode of the map.

To slow down the spread of COVID-19, a team of Seattle engineers created Immutouch, a smart wristband vibrating every time a person wearing it tries to touch their face.


Smart speakers, lights, and security systems are being used to open doors and switch on lights to reduce in-home infections. These gadgets allow people to avoid touching the surfaces of doorknobs, switches, mail, packages, or anything that could easily spread germs.

The Role of Big Data in Fighting Coronavirus

Tapping into big data is a must to develop real-time forecasts and arm healthcare professionals with a profound database to help with decision-making.


The IBM Clinical Development system is an advanced Electronic Data Capture (EDC) platform that accelerates the delivery of medications to market and reduces the time and cost of clinical trials thanks to cognitive computing, patient data assets, and IoT. Additionally, the U.S. government has been in active talks with Facebook, Google, and others to determine how to use location data to glean insights for combating the COVID-19 pandemic.

Could Mobile Apps be Used to Control the Pandemic?

The COVID-19 pandemic has become a game-changer for the healthcare continuum. Today’s mobile apps are on guard to help patients receive online therapy and at-home testing, conduct self-checks, and improve mental well-being. Thanks to smartphone apps, it is now possible to trace the virus’s journey and help limit its spread.

Apple COVID-19, for instance, was created in partnership with the Centers for Disease Control and Prevention (CDC), the White House, and the Federal Emergency Management Agency (FEMA). The application contains vital and relevant information from trusted sources on the coronavirus pandemic: hand hygiene practices, social distancing FAQs, quarantine guidelines, self-checking tutorials, and tips on cleaning and disinfecting surfaces. On top of that, it has a screening tool that advises people on what to do when a person has COVID-19 symptoms, has just returned from abroad, or has come in close contact with someone who might be infected with the disease.

Meanwhile, health authorities in Abu Dhabi have created the TraceCovid app for Bluetooth-enabled smartphones to minimize the spread of the disease. The service allows tracing individuals who have come into proximity with a person who tested positive for COVID-19. Thanks to it, medical professionals can react faster and render the necessary healthcare. Germany, in turn, is going to roll out a smartphone app which will use Bluetooth to alert people if they are close to someone with a confirmed viral infection.


Telemedicine has also proved to be an efficient tool for flattening the curve. The Sheba Medical Center, the largest Israeli hospital, launched a telehealth program for remote patient monitoring to control the pandemic’s spread. Doctolib, a Franco-German company, Qare (France), Livi (Sweden), Push Doctor (the UK), and CompuGroup Medical (Germany) are offering virtual doctor visits too.

Using AI to Identify, Track and Forecast Outbreaks

Artificial intelligence powered by natural language processing (NLP) and location monitoring is crucial for identifying, tracking, and scanning outbreaks, predicting hotspots, and helping make better decisions.

For example, Microsoft collaborated with the U.S. Centers for Disease Control and Prevention (CDC) to create an AI-based COVID-19 assessment bot to treat patients more effectively and allocate limited resources. The bot, nicknamed Clara, can evaluate symptoms, advise on the next steps to take, and track users who need urgent care the most.

The Canadian startup BlueDot has applied AI to spot and track the spread of COVID-19 and predict outbreaks, and the Japanese company Bespoke rolled out Bebot, an AI-powered chatbot that was developed specifically for travelers. This mobile app informs and assists them with coronavirus-related questions as they move about.


There’s no doubt that the coronavirus pandemic has become a real-life test for everyone. It has caused tremendous damage, but at the same time, it has forced tech innovators to roll out advanced solutions, and it seems that they don’t plan on slowing down anytime soon.

Healthcare providers across the globe are continually switching to smart technologies. So if you are in the smart technology niche, consider the current trends to steer your business in the right direction.



Chatbots and Intelligent Automation Solutions Paving the Way towards Seamless Business Continuity




Frequent business disruptions, in the form of storms, pandemics, lockdowns, etc., pose a risk to seamless operations and revenue generation in service industries. One day of operational disruption leads to losses worth millions. Semi-automation cannot stop the cascading effects of an unprecedented business disruption. Services such as banking, financial services, insurance, healthcare, and information technology cannot afford the risk of downtime. Chatbots powered by Intelligent Automation are the indispensable solution in the omni-channel customer interface that keeps the business moving 24×7, even in the face of a major business disruption such as a long-prevailing pandemic.

How do Intelligent Automation-powered chatbots offer seamless business continuity?

Chatbots combine diverse capabilities such as Robotic Process Automation (RPA) and Artificial Intelligence (AI) / Machine Learning (ML), in short, Intelligent Automation, and offer a lifeline to service-industry businesses. Chatbots sit on the key pages of a business website or on the business’s social media pages, and can be accessed by customers and prospects round the clock in different international languages. They augment the services of the regular service desk and help tide over most emergency situations.

Chatbots can handle complex queries; their functioning depends on the training data set and the streamlined data in the CRM database. All chatbot interactions can be further cleaned, stored in the CRM, and analysed. Based on these interactions at different stages of the customer journey, the chatbots can make intelligent suggestions during subsequent customer interactions.

Chatbots offer tremendous business benefits. The responses are highly accurate and relevant and have a minuscule turnaround time. On-time responses, from order booking to bill payment, that take care of customer preferences ensure high productivity and thereby generate high revenue even when a business executive is not able to interact directly with a customer.

In conclusion:

A chatbot solution powered by Intelligent Automation is an indispensable tool in the omni-channel customer service desk of a service-industry business. It helps keep the business up and running even when customer executives cannot interact directly with customers due to unprecedented business disruptions. Chatbot solutions thereby enable businesses to stay up and functioning at all times, 24x7x365.




How Hazelcast hopes to make digital transformation mainstream




Commentary: Even as the coronavirus pandemic has hastened digital transformation efforts, success remains elusive for many companies. This one-stop shop to digital transformation might help.

Digital transformation

Image: metamorworks, Getty Images/iStockphoto

It’s no secret that, as CircleCI CEO Jim Rose put it, “The pandemic has compressed the time[line]” for digital transformation. What is perhaps surprising is just how broad and deep that transformation is spreading. In an interview, Hazelcast chief product officer David Brimley stressed that while Fortune 500 e-commerce and finance companies have historically paid the bills for Hazelcast, provider of an open source in-memory data grid (IMDG), mid-sized enterprises “are coming to us and saying, we want to start digitizing and [adding digital] channels for our business.”

How they get there, and how fast, is the question. 

SEE: Digital transformation road map (free PDF) (TechRepublic)

A one-stop shop to digital transformation 

As keen as companies are to move workloads to the cloud to facilitate digital transformation, not all companies are alike in their readiness, Brimley said. In particular, these mid-sized enterprises may lack the personnel or other resources to push aggressively into the cloud, whatever their intentions. As such, he said, many companies are trying to figure out “the quickest way I can get the applications and hardware I’ve got today in my own data centers and add a digital channel on the top of it as quickly as I can.” 

No PhD in Computer Geekery required.

SEE: Special report: Prepare for serverless computing (free PDF) (TechRepublic)

By pairing Hazelcast IMDG for distributed coordination and in-memory data storage with Hazelcast Jet for building streaming data pipelines, Brimley said, organizations can build digital integration hubs without having the technical chops of a Netflix or Facebook. “There are a lot of companies that can’t make head nor tail of this plethora of Cloud Native Computing Foundation products [Kubernetes, Envoy, Fluentd, etc.], and they just want to stand up a Java process, have it clustered together, have some way of running their ‘microservices’ on this Java cluster, and off they go.”

Once, a company (and open source project) like Hazelcast would have had to pitch itself to banks and credit card companies for low-latency, high-performance distributed systems; these were the types of organizations that valued IMDGs. Today, however, such concerns span a much broader range of companies, particularly given this crushing need to achieve digital transformation.

Brimley and Hazelcast aren’t pitching the product as a database or any particular technology. Even the IMDG label might not fit particularly well. After all, the company isn’t positioning itself around technology, per se, but rather around solving business problems; around how developers can use Hazelcast to capture “interesting new architectural patterns,” in Brimley’s words. It’s taking on the “I need to embrace an event-driven architecture” crowd, not selling a data cache or, yes, even an in-memory data grid.

Disclosure: I work for AWS, but these are my views and don’t reflect those of my employer.

Also see

