

React with TypeScript: Best Practices




React and TypeScript are two awesome technologies used by a lot of developers these days. Knowing how to do things can get tricky, and sometimes it’s hard to find the right answer. Not to worry. We’ve put together the best practices along with examples to clarify any doubts you may have.

Let’s dive in!

How React and TypeScript Work Together

Before we begin, let’s revisit how React and TypeScript work together. React is a “JavaScript library for building user interfaces”, while TypeScript is a “typed superset of JavaScript that compiles to plain JavaScript.” By using them together, we essentially build our UIs using a typed version of JavaScript.

The reason you might use them together would be to get the benefits of a statically typed language (TypeScript) for your UI. This means more safety and fewer bugs shipping to the front end.

Does TypeScript Compile My React Code?

A common question that’s always good to review is whether TypeScript compiles your React code. The way TypeScript works is similar to this interaction:

TS: “Hey, is this all your UI code?”
React: “Yup!”
TS: “Cool! I’m going to compile it and make sure you didn’t miss anything.”
React: “Sounds good to me!”

So the answer is yes, it does! But later, when we cover the tsconfig.json settings, most of the time you’ll want to use "noEmit": true. What this means is TypeScript will not emit JavaScript out after compilation. This is because typically, we’re just utilizing TypeScript to do our type-checking.

The output is handled, in a CRA setting, by react-scripts. We run yarn build and react-scripts bundles the output for production.

To recap, TypeScript compiles your React code to type-check your code. It doesn’t emit any JavaScript output (in most scenarios). The output is still similar to a non-TypeScript React project.
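As a quick illustration of this "type-check only" behavior, consider the standalone sketch below (the function is hypothetical, not from any particular project). With "noEmit": true, running `tsc` verifies the types but writes no JavaScript to disk:

```typescript
// A plain function with annotated types. `tsc --noEmit` type-checks
// this file but produces no JavaScript output.
function greet(name: string): string {
  return `Hello, ${name}`
}

const message = greet('world') // OK: argument matches the `string` parameter
// greet(42) // would fail type-checking: number is not assignable to string
console.log(message)
```

In a CRA project you never invoke `tsc` directly for builds; react-scripts handles bundling, and TypeScript's job is purely the checking step shown here.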

Can TypeScript Work with React and webpack?

Yes, TypeScript can work with React and webpack. Lucky for you, the official TypeScript Handbook has a guide on that.

Hopefully, that gives you a gentle refresher on how the two work together. Now, on to best practices!

Best Practices

We’ve researched the most common questions and put together this handy list of the most frequent use cases for React with TypeScript. This way, you can use this article as a reference for following best practices in your own projects.


One of the least fun, yet most important, parts of development is configuration. How can we set things up in the shortest amount of time to provide maximum efficiency and productivity? We’ll discuss project setup, including:

  • tsconfig.json
  • ESLint
  • Prettier
  • VS Code extensions and settings.

Project Setup

The quickest way to start a React/TypeScript app is by using create-react-app with the TypeScript template. You can do this by running:

npx create-react-app my-app --template typescript

This will get you the bare minimum to start writing React with TypeScript. A few noticeable differences are:

  • the .tsx file extension
  • the tsconfig.json
  • the react-app-env.d.ts

The .tsx extension is for “TypeScript JSX”. The tsconfig.json is the TypeScript configuration file, which has some defaults set. The react-app-env.d.ts references the types of react-scripts, and helps with things like allowing for SVG imports.
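For reference, the generated react-app-env.d.ts contains a single triple-slash directive that pulls in the react-scripts type definitions:

```typescript
/// <reference types="react-scripts" />
```

That one line is what teaches TypeScript about non-code imports (SVGs, CSS modules, and so on) in a CRA project.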


tsconfig.json

Lucky for us, the latest React/TypeScript template generates a tsconfig.json for us. However, it adds only the bare minimum to get started. We suggest you modify yours to match the one below. We’ve added comments to explain the purpose of each option as well:

{
  "compilerOptions": {
    "target": "es5", // Specify ECMAScript target version
    "lib": ["dom", "dom.iterable", "esnext"], // List of library files to be included in the compilation
    "allowJs": true, // Allow JavaScript files to be compiled
    "skipLibCheck": true, // Skip type checking of all declaration files
    "esModuleInterop": true, // Disables namespace imports (import * as fs from "fs") and enables CJS/AMD/UMD style imports (import fs from "fs")
    "allowSyntheticDefaultImports": true, // Allow default imports from modules with no default export
    "strict": true, // Enable all strict type checking options
    "forceConsistentCasingInFileNames": true, // Disallow inconsistently-cased references to the same file
    "module": "esnext", // Specify module code generation
    "moduleResolution": "node", // Resolve modules using Node.js style
    "resolveJsonModule": true, // Include modules imported with .json extension
    "noEmit": true, // Do not emit output (meaning do not compile code, only perform type checking)
    "jsx": "react", // Support JSX in .tsx files
    "sourceMap": true, // Generate corresponding .map file
    "declaration": true, // Generate corresponding .d.ts file
    "noUnusedLocals": true, // Report errors on unused locals
    "noUnusedParameters": true, // Report errors on unused parameters
    "experimentalDecorators": true, // Enables experimental support for ES decorators
    "incremental": true, // Enable incremental compilation by reading/writing information from prior compilations to a file on disk
    "noFallthroughCasesInSwitch": true // Report errors for fallthrough cases in switch statements
  },
  "include": ["src/**/*"], // *** The files TypeScript should type check ***
  "exclude": ["node_modules", "build"] // *** The files to not type check ***
}

The additional recommendations come from the react-typescript-cheatsheet community, and the explanations come from the Compiler Options docs in the official TypeScript Handbook. This is a wonderful resource if you want to learn about other options and what they do.


ESLint and Prettier

In order to ensure that your code follows the rules of your project or team, and that its style stays consistent, it’s recommended you set up ESLint and Prettier. To get them to play nicely, follow these steps to set them up.

  1. Install the required dev dependencies:

    yarn add eslint @typescript-eslint/parser @typescript-eslint/eslint-plugin eslint-plugin-react --dev
  2. Create a .eslintrc.js file at the root and add the following:

    module.exports = {
      parser: '@typescript-eslint/parser', // Specifies the ESLint parser
      extends: [
        'plugin:react/recommended', // Uses the recommended rules from @eslint-plugin-react
        'plugin:@typescript-eslint/recommended', // Uses the recommended rules from @typescript-eslint/eslint-plugin
      ],
      parserOptions: {
        ecmaVersion: 2018, // Allows for the parsing of modern ECMAScript features
        sourceType: 'module', // Allows for the use of imports
        ecmaFeatures: {
          jsx: true, // Allows for the parsing of JSX
        },
      },
      rules: {
        // Place to specify ESLint rules. Can be used to overwrite rules specified from the extended configs
        // e.g. "@typescript-eslint/explicit-function-return-type": "off",
      },
      settings: {
        react: {
          version: 'detect', // Tells eslint-plugin-react to automatically detect the version of React to use
        },
      },
    }
  3. Add Prettier dependencies:

    yarn add prettier eslint-config-prettier eslint-plugin-prettier --dev
  4. Create a .prettierrc.js file at the root and add the following:

    module.exports = {
      semi: true,
      trailingComma: 'all',
      singleQuote: true,
      printWidth: 120,
      tabWidth: 4,
    }
  5. Update the .eslintrc.js file:

    module.exports = {
      parser: '@typescript-eslint/parser', // Specifies the ESLint parser
      extends: [
        'plugin:react/recommended', // Uses the recommended rules from @eslint-plugin-react
        'plugin:@typescript-eslint/recommended', // Uses the recommended rules from @typescript-eslint/eslint-plugin
    +   'prettier/@typescript-eslint', // Uses eslint-config-prettier to disable ESLint rules from @typescript-eslint/eslint-plugin that would conflict with prettier
    +   'plugin:prettier/recommended', // Enables eslint-plugin-prettier and displays prettier errors as ESLint errors. Make sure this is always the last configuration in the extends array.
      ],
      parserOptions: {
        ecmaVersion: 2018, // Allows for the parsing of modern ECMAScript features
        sourceType: 'module', // Allows for the use of imports
        ecmaFeatures: {
          jsx: true, // Allows for the parsing of JSX
        },
      },
      rules: {
        // Place to specify ESLint rules. Can be used to overwrite rules specified from the extended configs
        // e.g. "@typescript-eslint/explicit-function-return-type": "off",
      },
      settings: {
        react: {
          version: 'detect', // Tells eslint-plugin-react to automatically detect the version of React to use
        },
      },
    }

These recommendations come from a community resource called “Using ESLint and Prettier in a TypeScript Project”, written by Robert Cooper. If you visit his blog, you can read more about the “why” behind these rules and configurations.

VSCode Extensions and Settings

We’ve added ESLint and Prettier, and the next step to improve our DX is to automatically fix and prettify our code on save.

First, install the ESLint extension and the Prettier extension for VSCode. This will allow ESLint to integrate with your editor seamlessly.

Next, update your Workspace settings by adding the following to your .vscode/settings.json:

{
  "editor.formatOnSave": true
}

This will allow VS Code to work its magic and fix your code when you save. It’s beautiful!
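If you also want ESLint problems fixed automatically on save (in addition to Prettier formatting), a commonly used variant of the same settings file is the one below. Note this assumes the ESLint extension is installed; treat it as an optional sketch rather than part of the original setup:

```json
{
  "editor.formatOnSave": true,
  "editor.codeActionsOnSave": {
    "source.fixAll.eslint": true
  }
}
```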

These suggestions also come from the previously linked article “Using ESLint and Prettier in a TypeScript Project” by Robert Cooper.


Components

One of the core concepts of React is components. Here, we’ll be referring to standard components as of React v16.8, meaning ones that use hooks as opposed to classes.

In general, there’s not much to be concerned with for basic components. Let’s look at an example:

import React from 'react'

// Written as a function declaration
function Heading(): React.ReactNode {
  return <h1>My Website Heading</h1>
}

// Written as a function expression
const OtherHeading: React.FC = () => <h1>My Website Heading</h1>

Notice the key difference here. In the first example, we’re writing our function as a function declaration, and we annotate the return type with React.ReactNode because that’s what it returns. In contrast, the second example uses a function expression. Because the second instance is a function assigned to a constant, rather than a declaration, we annotate the constant with the type React.FC, for React “Function Component”.

It can be confusing to remember the two. It’s mostly a matter of design choice. Whichever you choose to use in your project, use it consistently.


Props

The next core concept we’ll cover is props. You can define your props using either an interface or a type. Let’s look at another example:

import React from 'react'

interface Props {
  name: string;
  color: string;
}

type OtherProps = {
  name: string;
  color: string;
}

// Notice here we're using the function declaration with the interface Props
function Heading({ name, color }: Props): React.ReactNode {
  return <h1>My Website Heading</h1>
}

// Notice here we're using the function expression with the type OtherProps
const OtherHeading: React.FC<OtherProps> = ({ name, color }) =>
  <h1>My Website Heading</h1>

When it comes to types or interfaces, we suggest following the guidelines presented by the react-typescript-cheatsheet community:

  • “always use interface for public API’s definition when authoring a library or 3rd-party ambient type definitions.”
  • “consider using type for your React Component Props and State, because it is more constrained.”

You can read more about the discussion and see a handy table comparing types vs interfaces here.

Let’s look at one more example so we can see something a little bit more practical:

import React from 'react'

type Props = {
  /** color to use for the background */
  color?: string;
  /** standard children prop: accepts any valid React Node */
  children: React.ReactNode;
  /** callback function passed to the onClick handler */
  onClick: () => void;
}

const Button: React.FC<Props> = ({ children, color = 'tomato', onClick }) => {
  return <button style={{ backgroundColor: color }} onClick={onClick}>{children}</button>
}

In this <Button /> component, we use a type for our props. Each prop has a short description listed above it to provide more context to other developers. The ? after the prop named color indicates that it’s optional. The children prop takes a React.ReactNode because it accepts everything that’s a valid return value of a component (read more here). To account for our optional color prop, we use a default value when destructuring it. This example should cover the basics and show you how to write types for your props and use both optional and default values.

In general, keep these things in mind when writing your props in a React and TypeScript project:

  • Always add descriptive comments to your props using the TSDoc notation /** comment */.
  • Whether you use types or interfaces for your component props, use them consistently.
  • When props are optional, handle appropriately or use default values.


Hooks

Luckily, TypeScript’s type inference works well when using hooks. This means you don’t have much to worry about. For instance, take this example:

// `value` is inferred as a string
// `setValue` is inferred as (newValue: string) => void
const [value, setValue] = useState('')

TypeScript infers the values given to us by the useState hook. This is an area where React and TypeScript just work together, and it’s beautiful.

On the rare occasions where you need to initialize a hook with a null-ish value, you can make use of a generic and pass a union to correctly type your hook. See this instance:

type User = {
  email: string;
  id: string;
}

// the generic is the < >
// the union is the User | null
// together, TypeScript knows, "Ah, user can be User or null".
const [user, setUser] = useState<User | null>(null);

The other place where TypeScript shines with hooks is with useReducer, where you can take advantage of discriminated unions. Here’s a useful example:

type AppState = {};
type Action =
  | { type: "SET_ONE"; payload: string }
  | { type: "SET_TWO"; payload: number };

export function reducer(state: AppState, action: Action): AppState {
  switch (action.type) {
    case "SET_ONE":
      return { ...state, one: action.payload }; // `payload` is string
    case "SET_TWO":
      return { ...state, two: action.payload }; // `payload` is number
    default:
      return state;
  }
}

Source: react-typescript-cheatsheet Hooks section

The beauty here lies in the usefulness of discriminated unions. Notice how Action is a union of two similar-looking objects. The property type is a string literal. Unlike the type string, the value must exactly match the literal string defined in the type. This means your program is extra safe, because a developer can only call an action that has a type key set to "SET_ONE" or "SET_TWO".
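To see the narrowing in isolation, here’s a small self-contained sketch (the helper function name is our own, purely for illustration): in each switch case TypeScript knows exactly which shape payload has.

```typescript
// A discriminated union: `type` is a string literal that tags each variant.
type Action =
  | { type: 'SET_ONE'; payload: string }
  | { type: 'SET_TWO'; payload: number }

// TypeScript narrows `action.payload` inside each case.
function describeAction(action: Action): string {
  switch (action.type) {
    case 'SET_ONE':
      return action.payload.toUpperCase() // payload narrowed to string
    case 'SET_TWO':
      return action.payload.toFixed(2) // payload narrowed to number
  }
}

console.log(describeAction({ type: 'SET_ONE', payload: 'hi' })) // "HI"
```

Calling string methods in the "SET_TWO" branch (or vice versa) would be a compile-time error, which is exactly the safety the reducer above relies on.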

As you can see, Hooks don’t add much complexity to the nature of a React and TypeScript project. If anything, they lend themselves well to the duo.

Common Use Cases

This section is to cover the most common use cases where people stumble when using TypeScript with React. We hope by sharing this, you’ll avoid the pitfalls and even share this knowledge with others.

Handling Form Events

One of the most common cases is correctly typing the onChange used on an input field in a form. Here’s an example:

import React from 'react'

const MyInput = () => {
  const [value, setValue] = React.useState('')

  // The event type is a "ChangeEvent"
  // We pass in "HTMLInputElement" to the input
  function onChange(e: React.ChangeEvent<HTMLInputElement>) {
    setValue(e.currentTarget.value)
  }

  return <input value={value} onChange={onChange} id="input-example"/>
}

Extending Component Props

Sometimes you want to take component props declared for one component and extend them to use them on another component. But you might want to modify one or two. Well, remember how we looked at the two ways to type component props, types or interfaces? Which one you used determines how you extend the component props. Let’s first look at the way using type:

import React from 'react';

type ButtonProps = {
  /** the background color of the button */
  color: string;
  /** the text to show inside the button */
  text: string;
}

type ContainerProps = ButtonProps & {
  /** the height of the container (value used with 'px') */
  height: number;
}

const Container: React.FC<ContainerProps> = ({ color, height, text }) => {
  return <div style={{ backgroundColor: color, height: `${height}px` }}>{text}</div>
}

If you declared your props using an interface, then we can use the keyword extends to essentially “extend” that interface but make a modification or two:

import React from 'react';

interface ButtonProps {
  /** the background color of the button */
  color: string;
  /** the text to show inside the button */
  text: string;
}

interface ContainerProps extends ButtonProps {
  /** the height of the container (value used with 'px') */
  height: number;
}

const Container: React.FC<ContainerProps> = ({ color, height, text }) => {
  return <div style={{ backgroundColor: color, height: `${height}px` }}>{text}</div>
}

Both methods solve the problem. It’s up to you to decide which to use. Personally, extending an interface feels more readable, but ultimately, it’s up to you and your team.

You can read more about both concepts in the TypeScript Handbook.

Third-party Libraries

Whether it’s for a GraphQL client like Apollo or for testing with something like React Testing Library, we often find ourselves using third-party libraries in React and TypeScript projects. When this happens, the first thing you want to do is see if there’s a @types package with the TypeScript type definitions. You can do so by running:

# yarn
yarn add @types/<package-name>

# npm
npm install @types/<package-name>

For instance, if you’re using Jest, you can do this by running:

# yarn
yarn add @types/jest

# npm
npm install @types/jest

This would then give you added type-safety whenever you’re using Jest in your project.

The @types namespace is reserved for package type definitions. They live in a repository called DefinitelyTyped, which is maintained partly by the TypeScript team and partly by the community.

Should these be saved as dependencies or devDependencies in my package.json?

The short answer is “it depends”. Most of the time, they can go under devDependencies if you’re building a web application. However, if you’re writing a React library in TypeScript, you may want to include them as dependencies.
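As a sketch, a web application’s package.json would typically list type packages under devDependencies; the version numbers below are illustrative placeholders, not recommendations:

```json
{
  "devDependencies": {
    "@types/jest": "^26.0.20",
    "typescript": "^4.1.0"
  }
}
```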

There are a few answers to this on Stack Overflow, which you may check out for further information.

What happens if they don’t have a @types package?

If you don’t find a @types package on npm, then you essentially have two options:

  1. Add a basic declaration file
  2. Add a thorough declaration file

The first option means you create a file based on the package name and put it at the root. If, for instance, we needed types for our package banana-js, then we could create a basic declaration file called banana-js.d.ts at the root:

declare module 'banana-js';

This won’t provide you type safety, but it will unblock you.

A more thorough declaration file would be where you add types for the library/package:

declare namespace bananaJs {
  function getBanana(): string;
  function addBanana(n: number): void;
  function removeBanana(n: number): void;
}

If you’ve never written a declaration file, then we suggest you take a look at the guide in the official TypeScript Handbook.


Conclusion

Using React and TypeScript together in the best way takes a bit of learning due to the amount of information, but the benefits pay off immensely in the long run. In this article, we covered configuration, components, props, hooks, common use cases, and third-party libraries. Although we could dive deeper into a lot of individual areas, this should cover the 80% needed to help you follow best practices.

If you’d like to see this in action, you can see this example on GitHub.

If you’d like to get in touch, share other best practices or chat about using the two technologies together, you can reach me on Twitter @jsjoeio.

Further Reading

If you’d like to dive deeper, here are some resources we suggest:


react-typescript-cheatsheet

A lot of these recommendations came straight from the react-typescript-cheatsheet. If you’re looking for specific examples or details on anything React-TypeScript, this is the place to go. We welcome contributions as well!

Official TypeScript Handbook

Another fantastic resource is the TypeScript Handbook. This is kept up to date by the TypeScript team and provides examples and an in-depth explanation behind the inner workings of the language.

TypeScript Playground

Did you know you can test out React with TypeScript code right in the browser? All you have to do is import React. Here’s a link to get you started.

Practical Ways to Advance Your TypeScript Skills

Read our guide on practical ways to advance your TypeScript skills to set yourself up for continuous learning as you move forward.



How To Manage A Technical Debt Properly




Alex Omeyer (@alex-omeyer)

Co-founder & CEO at Stepsize, a SaaS to measure & manage technical debt

We’re used to thinking that you cannot deliver fast and maintain a healthy codebase. But does it really have to be a trade-off?

One of my greatest privileges building Stepsize has been hearing from hundreds of the best engineering teams in the world about how they ship software at pace while maintaining a healthy codebase.

That’s right, these teams go faster because they manage technical debt properly. We’re so used to the quality vs. cost trade-off that this statement sounds like a lie—you can’t both be fast and maintain a healthy codebase.

Martin Fowler does a great job of debunking this idea in his piece “Is high quality software worth the cost?”. Spoiler:

High quality software is actually cheaper to produce.

The lessons I’ll relay in this article are drawn from the combined centuries of experience of the 300+ software engineers I’ve interviewed.

Why bother?

As Adam Tornhill and I recently discussed in our webinar, software has well and truly eaten the world. And look, if you’re here, this will probably sound like a cliché to you. In this case, it’s because it’s true. Look around you, can you name one object that didn’t need some form of software intervention to be manufactured, purchased, or delivered to you?

Software companies live and die by the quality of their software, and the speed at which they deliver it.

Stripe found that ‘engineers spend 33% of their time dealing with technical debt’. Gartner found that companies that manage technical debt ship 50% faster than those that don’t. These data points may seem a little dry, but we intuitively know they’re true. How many times have we estimated a feature would be delivered in a sprint, only for it to take two? Now take a moment to extrapolate and think about the impact this will have on your company over a year, two, or its entire lifespan.

Is it not clear that companies who manage technical debt properly simply win?

A simple framework to achieve these results

Google around for ‘types of technical debt’, and you’ll find hordes of articles by authors geeking out about code debt, design debt, architecture debt, process debt, infrastructure debt—this debt that debt.

These articles are helpful in that they can train you to recognise technical debt when you come across it, but they won’t help you decide how to deal with each piece of debt, let alone how to manage tech debt as a company.

The only thing that matters is whether you’re dealing with a small, medium, or large piece of debt.

The process for small pieces of debt

This is the type of tech debt that can be handled as soon as the engineer spots it in the code—a quick refactoring or variable rename. Engineers don’t need anyone’s approval to do this, or to create a ticket for it to be prioritised. It is simply part of their job to apply the Boy Scout Rule coined by Uncle Bob:

Always leave the code better than you found it.

This is table stakes at every software company I’ve interviewed that has its tech debt under control. It’s mostly driven by engineering culture, gets enforced in PRs or with linters, and it’s understood that it’s every individual contributor’s responsibility to handle small pieces of debt when they come across them.

The process for medium-sized debt

The top performers I’ve interviewed stress the importance of addressing technical debt continuously as opposed to tackling it in big projects.

Paying off technical debt is a process, not a project.

You do not want to end up in a situation where you need to stop all feature development to rewrite your entire application every three to five years.

This is why these teams dedicate 10-30% of every sprint to maintenance work that tackles technical debt. I call the tech debt that is surfaced and addressed as part of this process medium-sized debt.

To determine what proportion of your sprint to allocate to tech debt, simply find the overlap between the parts of your codebase you’ll modify with your feature work and the parts of your codebase where your worst tech debt lives. You can then scope out the tech debt work and allocate resources accordingly. Some teams even increase the scope of their feature work to include the relevant tech debt clean-up. More in this article, “How to stop wasting time on tech debt”.

For this to work, individual contributors need to track medium-sized debt whenever they come across it. It is then the Team Lead’s responsibility to prioritise this list of tech debt, and to discuss it with the Product Manager prior to sprint planning so that engineering resources can be allocated effectively.

The process for large pieces of debt

Every once in a while, your team will realise that some of the medium-sized debt they came across is actually due to a much larger piece of debt. For example, they may realise that the reason the frontend code is under-performing is because they should be using a different framework for the job.

Left unattended, these large pieces of debt can cause huge problems, and—like all tech debt—get much worse as time goes by.

The best companies I’ve interviewed have monthly or quarterly technical planning sessions in which all engineering and product leaders participate. Depending on the size of the company, Staff Engineers, Principal Engineers, and/or Engineering Managers are responsible for putting together technical proposals outlining the problem, solution, and business case for each of these large pieces of debt. These then get reviewed by engineering and product leadership, and the ones that get prioritised are added to the roadmap.

How to make this process easier

In order to be able to run this process, you need to have visibility into your tech debt. A lot of companies I’ve spoken to try to achieve this by creating a tech debt backlog in their project management tool or in a spreadsheet.

It’s a great way to start, but here’s the problem: these issues will not contain the context necessary for you to prioritise them effectively. Not only do you need to rank each tech debt issue against all others, you also need to convincingly argue that fixing this tech debt is more important than using these same engineering resources towards shipping a new feature instead.

Here’s the vicious cycle that ensues: the team tracks debt, you can’t prioritise it, so you can’t fix it, the backlog grows, it’s even harder to prioritise and make sense of it, you’re still not effectively tackling your debt, so the team stops tracking it. You no longer have visibility into your debt, still can’t prioritise it, and it was all for nothing.

We built Stepsize to solve this exact problem. With our product, engineers can track debt directly from their workflow (code editor, pull request, Slack, and more) so that you can have visibility into your debt. Stepsize automatically picks up important context like the code the debt relates to, and engineers get to quantify the impact the debt is having on the business and the risks it presents (e.g. time lost, customer risk, and more) so that you can prioritise it easily.

You can join the best software companies by adopting this process: start here.







[PRESS RELEASE – Please Read Disclaimer]

Zug, Switzerland, 9th March, 2021, // ChainWire // Privacy-centric blockchain Concordium has finalized its MVP testnet and concluded a private sale of tokens to fund further development. The company secured $15M in additional funding for its public, permissionless, compliance-ready, privacy-centric blockchain.

In late February, Concordium announced a joint venture between Concordium and Geely Group, a Fortune 500 company and automotive technology firm. The partnership will focus on building blockchain-based services on Concordium’s enterprise-focused chain.

Concordium recently completed Testnet 4, which saw over 2,300 self-sovereign identities issued and over 7,000 accounts created, with more than 1,000 active nodes, 800 bakers, and over 3,600 wallet downloads. The successful testnet led to the release of Concordium smart contracts functionality based on RustLang, with a select group of community members participating in stress-testing the network. Test deployments for smart contracts included gaming, crowdfunding, time-stamping, and voting.

Concordium CEO Lone Fonss Schroder said: “The interest of the community, from RustLang developers, VCs, system integrators, family offices, crypto service providers, and private persons, has been amazing. Concordium has fielded strong demand from DeFi projects looking to build on a blockchain with ID at the protocol level.”

Concordium will bring its blockchain technology into broad use. It also appeals to enterprises, with protocol-level ID protected by zero-knowledge proofs and stable transaction costs to support predictable, fast, and secure transactions. Its core scientific team is made up of renowned researchers Dr. Torben Pedersen, creator of the Pedersen commitment, and Prof. Ivan Damgård, father of the Merkle–Damgård construction.

Concordium, which is on course for a mainnet launch in Q2, aims to solve the long-standing blockchain-for-enterprise problem in a novel way: a unique software stack based on peer-reviewed, demonstrated identity and privacy technologies that provides speed, security, and counterparty transparency.

The Concordium team intends to announce its post-mainnet roadmap in the coming days.

About Concordium

Concordium is a next-generation, broad-focused, decentralized blockchain and the first to introduce built-in ID at the protocol level. Concordium’s core features solve the shortcomings of classic blockchains by allowing identity management at the protocol level and zero-knowledge proofs, which are used to replace anonymity with perfect privacy. The technology supports encrypted payments with software that upholds future regulatory compliance demands for transactions made on the blockchain. Concordium employs a team of dedicated cryptographers and business experts to further its vision. Protocols are science-proofed by peer reviews and developed in cooperation with Concordium Blockchain Research Center Aarhus, Aarhus University, and other global leading universities, such as ETH Zürich, a world-leading computer science university, and the Indian Institute of Science.



Bowling in VR!




The bowling ball: By pressing the trigger on the controller, the user can pick up, hold, and release the ball. The weight and speed of the ball mimic the movement that a regular bowling ball would have. After it is thrown, the ball will respawn in the starting position once it hits the backstop, or through a manual reset from the user pressing the “reset ball” button.

The pins: Once hit by the bowling ball, the pins fall down on collision. Once all of the pins are hit, they respawn to reset the game. The pins can also be manually reset through the button seen in the image above on the right-hand side. The designs of both the ball and the pins were pre-created assets, available for free in the asset store.

The asset store is your friend!

The lane: With wood flooring, the lane has a wall on either side and a backstop, simulating the bumpers regularly seen when bowling.


Here’s a look into how the game actually works!

The Physics:

These are the components added to the bowling ball for collision handling and interactivity. The rigid body and mesh collider are also applied to the bowling pins; the only differences are the mass (the pins are 1, while the ball is 3) and the OVR Grabbable script.

Rigid body: Ensures the laws of physics and gravity apply to the game object, and lets you apply forces and control it in a realistic way.

Mesh Collider: Checking “Convex” lets this mesh collider collide with other mesh colliders, so that objects don’t fall through the floor!

OVR Grabbable: From the free Oculus Integration in the asset store, the OVR Grabbable script comes pre-made, enabling user interactivity.
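As a rough illustration, the same component setup could also be done from code instead of the Inspector. This is a hypothetical sketch (the class name BallSetup is an assumption); it assumes the Oculus Integration’s OVRGrabbable script is already in the project:

```csharp
// Hypothetical sketch: configuring the ball's physics components from a script
// rather than the Inspector. OVRGrabbable comes from the free Oculus Integration.
using UnityEngine;

public class BallSetup : MonoBehaviour
{
    void Awake()
    {
        // Rigid body: applies gravity and lets forces act on the ball realistically
        var body = gameObject.AddComponent<Rigidbody>();
        body.mass = 3f; // the pins use a mass of 1

        // Mesh collider: "Convex" lets it collide with other mesh colliders
        var meshCollider = gameObject.AddComponent<MeshCollider>();
        meshCollider.convex = true;

        // OVR Grabbable: pre-made script enabling pick-up/hold/release via the trigger
        gameObject.AddComponent<OVRGrabbable>();
    }
}
```

In practice these components are usually added through the Inspector, as described above; the script form just makes the configuration explicit.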

Pin and Ball Reset:

To reset both the pins and the ball, the user can click on these floating buttons. I followed this tutorial (steps 4 & 5) to add the buttons, but the most important step is integrating the Tilia package along with all of the prefabs (fully configured game objects that have already been created for your use).

Installing the Tilia package into Unity: navigate to the manifest.json file in Finder (go to the actual folder on your computer). After opening it, there will be a section that says “dependencies”. At the bottom of this section, add “io.extendreality.tilia.input.unityinputmanager”: “1.3.16”. The code below is a shortened version of what the dependencies look like, with all the Tilia extensions I added to complete this project.

 "dependencies": {
"com.unity.xr.legacyinputhelpers": "2.1.7",
"": "3.2.17",
"com.unity.xr.oculus": "1.6.1",
"com.unity.xr.openvr.standalone": "2.0.5",
"com.unity.modules.xr": "1.0.0",

//Above are a few of the dependencies already installed, while the tilia extensions below were manually added

"io.extendreality.tilia.input.unityinputmanager": "1.3.16",
"io.extendreality.tilia.indicators.objectpointers.unity": "1.6.7",
"io.extendreality.tilia.camerarigs.trackedalias.unity": "1.5.7",
"io.extendreality.tilia.camerarigs.unityxr": "1.4.9",
"io.extendreality.tilia.camerarigs.spatialsimulator.unity": "1.2.31",
"io.extendreality.tilia.interactions.interactables.unity": "1.15.7",
"io.extendreality.tilia.interactions.spatialbuttons.unity": "1.2.3"

For the code behind the reset, here is a look into the C# script for the pins:

The Awake() call loads any setup needed before the game scene starts.

Inside it, the SavePositions() method is called, logging all of the pins’ starting positions in an array.

The ResetPositions() method contains a for loop that goes through each pin and sets its position and rotation back to the original values saved earlier. Each pin’s velocity is also zeroed out, in case it was still spinning after being knocked over.
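Putting those three pieces together, a minimal sketch of the pin-reset script might look like this (field names and details other than Awake(), SavePositions(), and ResetPositions() are assumptions, not the author’s exact code):

```csharp
// Minimal sketch of the pin-reset behavior described above.
using UnityEngine;

public class PinReset : MonoBehaviour
{
    public Rigidbody[] pins;          // assign the ten pins in the Inspector

    private Vector3[] startPositions;
    private Quaternion[] startRotations;

    void Awake()
    {
        SavePositions();              // record starting transforms before play begins
    }

    void SavePositions()
    {
        startPositions = new Vector3[pins.Length];
        startRotations = new Quaternion[pins.Length];
        for (int i = 0; i < pins.Length; i++)
        {
            startPositions[i] = pins[i].transform.position;
            startRotations[i] = pins[i].transform.rotation;
        }
    }

    public void ResetPositions()
    {
        for (int i = 0; i < pins.Length; i++)
        {
            pins[i].velocity = Vector3.zero;          // flatten any leftover motion
            pins[i].angularVelocity = Vector3.zero;   // and any spin-out
            pins[i].transform.position = startPositions[i];
            pins[i].transform.rotation = startRotations[i];
        }
    }
}
```

ResetPositions() can then be wired to the floating reset button’s activation event.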

Oculus Integration:

The Hierarchy!

Once again, the asset store comes in really handy! The free Oculus Integration is a collection of pre-made scripts and functions, such as the OVR Player Controller, which includes the camera rig for the Oculus and controller visibility. To properly set it up with controller integration, I followed this tutorial, which is also mentioned in the resources below. I had to turn on developer mode in the Oculus app and connect my computer to the headset with the USB-C charging cable.

Some awesome resources that helped me out:


Aave is a decentralized, open-source, non-custodial liquidity protocol that enables users to earn interest on cryptocurrency deposits, as well as borrow assets through smart contracts.

Aave is interesting (pardon the pun) because interest compounds immediately, rather than monthly or yearly. Returns are reflected by an increase in the number of aTokens held by the lending party.

Apart from helping to generate earnings, the protocol also offers flash loans. These are trustless, uncollateralized loans where borrowing and repayment occur in the same transaction. 

Assets on Aave as of 3/7/21 (source: aave homepage)


The following article explores Aave’s history, services, tokenomics, security, how the protocol works, and what users should be wary of when using the Aave platform.

How Does Aave Work?

The Aave protocol mints ERC-20 compliant tokens in a 1:1 ratio to the assets supplied by lenders. These tokens are known as aTokens and are interest-bearing in nature. These tokens are minted upon deposit and burned when redeemed. 

These aTokens, such as aDai, are pegged at a ratio of 1:1 to the value of the underlying asset – that is Dai in the case of aDai. 

The lending-borrowing mechanism of the Aave lending pool dictates that lenders will send their tokens to an Ethereum blockchain smart contract in exchange for these aTokens — assets that can be redeemed for the deposited token plus interest.  

aTokens on Aave


Borrowers withdraw funds from the Aave liquidity pool by depositing the required collateral; that collateral deposit also earns them interest-bearing aTokens representing the equivalent amount of the underlying asset.

Each liquidity pool (the liquidity market in the protocol where lenders deposit and borrowers withdraw) has a predetermined loan-to-value (LTV) ratio that determines how much a borrower can withdraw relative to their collateral. For example, at a 75% LTV, $1,000 of collateral allows borrowing up to $750 worth of assets. If the value of the borrower’s collateral falls and the position crosses the liquidation threshold, they face liquidation of their assets.

Humble Beginnings as ETHLend 

Aave was founded in May 2017 by Stani Kulechov as a decentralized peer-to-peer lending platform under the name ETHLend to create a transparent and open infrastructure for decentralized finance. ETHLend raised 16.5 million US dollars in its Initial Coin Offering (ICO) on November 25, 2017.

Kulechov, currently serving also as the CEO of Aave, has successfully led the company into the list of top 50 blockchain projects published by PWC. Aave is headquartered in London and backed by credible investors, such as Three Arrows Capital, Framework Ventures, ParaFi Capital, and DTC Capital.

ETHLend widened its bouquet of offerings and rebranded to Aave by September 2018. The Aave protocol was formally launched in January 2020, switching to the liquidity pool model from a Microstaking model.

To add context to this evolution from a Microstaking model to a liquidity pool model: under Microstaking, everyone using the ETHLend platform, whether applying for a loan, funding one, or creating a loan offer, had to purchase a ticket to obtain the rights to use the application, and that ticket had to be paid in the platform’s native token, LEND. The ticket price was a small amount pegged to USD, so the number of LEND needed varied with the token’s value.

In the liquidity pool model, lenders deposit funds into liquidity pools, creating what’s known as a liquidity market, and borrowers can withdraw funds from those pools by providing collateral. If borrowers become undercollateralized, they face liquidation.

Aave raised another 4.5 million US dollars in an ICO and 3 million US dollars from Framework Ventures on July 8 and July 15, 2020, respectively.

Aave Pronunciation

Aave is typically pronounced “ah-veh.” 

Aave’s Products and Services

The Aave protocol is designed to help people lend and borrow cryptocurrency assets. Operating under a liquidity pool model, Aave allows lenders to deposit their digital assets into liquidity pools managed by smart contracts on the Ethereum blockchain. In exchange, they receive aTokens, assets that can be redeemed for the deposited token plus interest.

Aave's functionality

Borrowers can take out a loan by putting their cryptocurrency as collateral. The liquidity protocol of Aave, as per the latest available numbers, is more than 4.73 billion US dollars strong. 

Flash Loans

Aave’s Flash loans are a type of uncollateralized loan option, which is a unique feature even for the DeFi space. The Flash Loan product is primarily utilized by speculators seeking to take advantage of quick arbitrage opportunities. 

Borrowers can instantly borrow cryptocurrency for a matter of seconds; they must return the borrowed amount to the pool within the same transaction block. If they fail to do so, the entire transaction reverts, undoing all actions executed up to that point.

Flash loans encourage a wide range of investment strategies that typically aren’t possible in such a short window of time. If used properly, a user could profit through arbitrage, collateral swapping, or self-liquidation.

Rate Switching

Aave allows borrowers to switch between fixed and floating rates, which is a fairly unique feature in DeFi. Interest rates in any DeFi lending and borrowing protocol are usually volatile, and this feature offers an alternative by providing an avenue of fixed stability. 

For example, if you’re borrowing money on Aave and expect interest rates to rise, you can switch your loan to a fixed rate to lock in your borrowing costs for the future. In contrast, if you expect rates to decrease, you can go back to floating to reduce your borrowing costs.

Aave Bug Bounty Campaign

Aave offers a bug bounty for cryptocurrency-savvy users. By submitting a bug to the Aave protocol, you can earn a reward of up to $250,000.

Aave Tokenomics

The maximum supply of the AAVE token is 16 million, and the current circulating supply is a little above 12.4 million AAVE tokens.

Initially, the token (then called LEND) had 1.3 billion units in circulation. In a July 2020 token swap, the protocol exchanged existing tokens for newly minted AAVE at a ratio of 100 LEND to 1 AAVE, turning 1.3 billion LEND into 13 million AAVE; a further 3 million tokens were minted and kept in reserve for the core team’s development fund, bringing the total to the current 16 million supply.

Aave’s price has been fairly volatile, with an all-time high of $559.12 on February 10, 2021. The lowest price was $25.97 on November 5th, 2020. 

Aave Security

Aave stores funds on a non-custodial smart contract on the Ethereum blockchain. As a non-custodial project, users maintain full control of their wallets. 

AAVE governance token holders can stake their tokens in the safety module, which acts as a sort of decentralized insurance fund designed to insure the protocol against shortfall events such as contract exploits. Stakers risk having up to 30% of the funds they lock in the module slashed to cover a shortfall, and in return earn a fixed yield of 4.66%.

The safety module has garnered $375 million in deposits, which is arguably the largest decentralized insurance fund of its kind. 

Final Thoughts: Why is Aave Important?

Aave is a DeFi protocol built on strong fundamentals and has forced other competitors in the DeFi space to bolster their value propositions to stay competitive. Features such as Flash loans and Rate switching offer a distinct utility to many of its users.

Aave emerged as one of the fastest-growing projects of the Summer 2020 DeFi craze. At the beginning of July 2020, the total value locked in the protocol was just above 115 million US dollars. Less than a year later, on February 13, 2021, the protocol crossed the 6 billion US dollar mark. The project currently supports borrowing and lending in 20 cryptocurrencies.

Aave is important because it shows how ripe the DeFi space is for disruption with new innovative features and how much room there is to grow.
