
Important Documents Prepared By A Business Analyst


This article was published as a part of the Data Science Blogathon

Preparing documents is one of the most critical tasks that every responsible business analyst performs. A Business Analyst not only documents the clients’ requirements but also records the progress and every change that occurs during the project lifecycle. It is vital to jot down the process flow and note these things for maintenance and future reference.

If you are new to this field or aspiring to become a successful business analyst, our guide to the critical documents prepared by business analysts should come in handy.

Without further ado, let’s dive right in and look at the documents every successful BA should know how to prepare.

Key Documents Prepared by a Business Analyst


In this list, we will discuss the essential documents a BA prepares, from project initiation to project delivery, in pursuit of the optimal business solution for the client:

  • Project Vision Document

  • Business Analysis Plan

  • Business Requirements Document

  • Functional Requirement Specification (FRS) / Functional Specification Document (FSD)

  • System Requirement Specification (SRS) / System Requirement Document (SRD)

  • Requirement Traceability Matrix (RTM)

  • Use Case Diagrams

  • Wireframes, Mockups

  • Change Request Document

Let’s discuss each of these vital documents in detail.

Project Vision Document

Even though the client or the project manager primarily creates the Project Vision Document, the business analyst’s role in developing it is equally important. A project vision document includes the goal and the vision of the product to be developed and discusses what objective will be achieved through that particular product. It also covers the benefits, the risks involved, and the options available before the project is initiated. This document acts as a formal agreement between the company and the business stakeholders.

A project vision document includes:

– Vision & Goal

– Description of users to be included in that particular project

– Stakeholders of the project

– Overview of Product to be developed

– Features of the product to be included while developing it

– Product requirements

– Constraints or Risks Involved

– Quality/documentation requirements

Business Analysis Plan

A Business Analysis Plan is a formal document that describes the major activities a Business Analyst needs to carry out during the project lifecycle. The business analyst prepares this document while planning the project scope. The people involved in this phase are the project managers, product owners, and business managers whose support is needed to execute the plan.

The Business Analysis Plan typically covers:

  1. Purpose of the plan

  2. Roles & Responsibility Distribution

  3. Tools Required to Execute the Project Plan

  4. Process & Techniques implemented to define the project

  5. Workflow & Process Mapping

  6. Adaptability & Measures to Implement Changes if Required

Business Requirement Document (BRD)


Image 1

A business requirement document, or BRD, defines the requirements of the particular product or software that the client needs the company to work on, so that the results discussed in the meetings with the client are finally achieved. The BRD is the one document that everyone refers to throughout the project life cycle. It helps the team make the right choices in every phase without making rookie mistakes along the way.

A BRD focuses on what the intended business solution should be, based on the problem statement presented by the client. Hence, it is safe to say that the BRD captures all the essential requirements as explained by the business stakeholders.

Along with the Business Analyst, the project manager, the scrum master, the client, the domain experts, and the senior managers create the BRD to ensure that the business requirements are correctly understood and written down for reference.

The Business Requirement Document normally contains:

– A brief background of the project

– Goals and objectives of the business

– Stakeholders involved in the project

– Requirement scope

– Relevant data gathered for the project

– Interface requirements for the project to function well

– Business glossary / jargon (if required)

Functional Requirement Specifications (FRS)

Image 2

BRDs are prepared in such a way that anyone can refer to the document easily. No technical jargon is used, and the business requirements are put down in layman’s terms to avoid confusion. However, a BRD is not the ideal document for the technical development team, which needs to understand the system thoroughly. Hence, a Functional Specification Document exists to meet the requirements of the development team.

While a BRD discusses what needs to be done or achieved in a particular project, the FRS focuses on how the team should complete the project and how the system should behave, so that the best path to completion within the stipulated time can be found. The Functional Specification Document also defines the intended behavior of the system, including data, operations, input, output, and the properties of the system. The FRS is more detailed and is intended for the development and testing teams, who rely on it to do their job on a particular project.

System Requirement Specification (SRS) / System Requirement Document (SRD)

Image 3

An SRS or SRD describes the complete behavior of the system and how the entire system has to function once developed. This document contains all the functional as well as non-functional requirements. The System Requirement Specification (SRS) / System Requirement Document (SRD) contains:

– Use Cases

– Type of Software Requirements to build the project

– Database & Storage Requirements for the product to function seamlessly

– Product Functions to understand the UI/UX

– User Characteristics to understand the target users for the intended software

Wireframe, Prototype, and Mockup


Image 4

It is essential to have a visual presentation of the product that is about to be developed. This mockup, also known as a wireframe, helps the client understand the future system, and it helps the BA confirm that the client agrees with the proposed idea.

To present this blueprint visually, a business analyst prepares a wireframe design with the help of wireframing tools (for example, Balsamiq or Axure). These diagrams can save plenty of time during the analysis phase, as they align everyone with the client’s requirements. The development team also uses the wireframe design as a reference when building the practical design.

Use Case Diagram

Business Analysts must also prepare use case diagrams, which help the team identify, organize, and define the system requirements. Use cases illustrate a typical user’s interaction with the system the team is developing. They additionally help record the scenarios in which the user will interact with the system, dealing with user stories and the many ways the interaction can unfold.

The development team refers to the use case diagrams throughout the project lifecycle. The BA updates the use case diagrams when the team or the key stakeholders request any changes.

A typical use case diagram and its accompanying descriptive requirement document contain:

– Actors

– Description

– Trigger

– Preconditions

– Notes and Issues

These are some of the details that you will find in the use case diagram.
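To make this structure concrete, here is a minimal Python sketch (the field names mirror the list above; the example values are invented for illustration) that models a use case description as a small data structure:

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class UseCase:
    """One use case record, mirroring the fields listed above."""
    name: str
    actors: List[str]            # who interacts with the system
    description: str             # what the interaction accomplishes
    trigger: str                 # the event that starts the use case
    preconditions: List[str]     # what must be true before it can run
    notes_and_issues: List[str] = field(default_factory=list)

# Hypothetical example for an online-banking system
transfer = UseCase(
    name="Transfer funds",
    actors=["Account holder"],
    description="The user moves money between two of their own accounts.",
    trigger="User selects 'Transfer' from the dashboard.",
    preconditions=["User is authenticated",
                   "Source account has sufficient balance"],
    notes_and_issues=["Clarify daily transfer limit with the client"],
)
print(transfer.name, "- actors:", ", ".join(transfer.actors))
```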

Requirement Traceability Matrix (RTM)

An RTM, or Requirement Traceability Matrix, is an important document prepared and maintained by a business analyst. A Business Analyst uses the RTM to map and track project requirements against particular test cases and possible defects. The RTM ensures that all the functionality of the developed application is covered and tested.

An RTM is prepared in a tabular format in tools like Excel, where the relationship between test scenarios and requirements is established. The RTM is also used to track any changes implemented in the project.

A Requirement Traceability Matrix typically contains (see the sketch after this list):

– Requirement Description

– Functional Requirement

– Technical Specification

– Software Module

– Tested In
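Because an RTM is essentially a table, it is easy to prototype outside Excel as well. Below is a minimal sketch in Python using pandas; the requirement IDs and test-case references are made up purely for illustration. It also shows the kind of coverage check an RTM enables:

```python
import pandas as pd

# Each row maps one requirement to the artifacts that realize and verify it.
rtm = pd.DataFrame([
    {"Requirement ID": "BR-001",
     "Requirement Description": "User can reset their password",
     "Functional Requirement": "FR-4.2",
     "Technical Specification": "TS-12",
     "Software Module": "auth",
     "Tested In": "TC-101, TC-102"},
    {"Requirement ID": "BR-002",
     "Requirement Description": "User receives an email confirmation",
     "Functional Requirement": "FR-4.3",
     "Technical Specification": "TS-13",
     "Software Module": "notifications",
     "Tested In": ""},  # not yet covered by any test case
])

# Coverage check: flag requirements with no linked test case.
untested = rtm[rtm["Tested In"].str.strip() == ""]
print(untested[["Requirement ID", "Requirement Description"]])
```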

Change Request Management

As a business analyst, you might come across many changes requested by the client or the development team during project execution. The business owners might request that some features be added or removed later on, if they deem fit, based on market considerations. Any additional request that might delay the project or change its planned course is considered a change request. Before implementing the change, the business analyst analyzes the timeline and the impact, and then obtains the necessary approvals before adding it to the project.

A Change Request Document is a great way to deal with scope creep, which occurs when the proposed plan of action drifts from the initial purpose due to the addition or deletion of features. Through regular meetings with the technical team and the client, it is then decided how the change request will be implemented and how the overall project plan will be affected.
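As a rough illustration (not a prescribed format), the analyze-then-approve flow described above can be modeled as a small state machine; the statuses and fields below are assumptions made for this sketch:

```python
from dataclasses import dataclass

# Assumed workflow: a request must be analyzed before it can be approved,
# and approved before it can be implemented.
VALID_TRANSITIONS = {
    "submitted": {"under_analysis"},
    "under_analysis": {"approved", "rejected"},
    "approved": {"implemented"},
}

@dataclass
class ChangeRequest:
    title: str
    impact_days: int = 0      # estimated schedule impact from the analysis
    status: str = "submitted"

    def move_to(self, new_status: str) -> None:
        """Advance the request, refusing transitions that skip analysis or approval."""
        if new_status not in VALID_TRANSITIONS.get(self.status, set()):
            raise ValueError(f"Cannot go from {self.status} to {new_status}")
        self.status = new_status

cr = ChangeRequest(title="Add export-to-CSV feature", impact_days=5)
cr.move_to("under_analysis")
cr.move_to("approved")
cr.move_to("implemented")
print(cr)
```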

Conclusion

As a business analyst, it is essential to document all the critical information and share it with the relevant teams to complete the project within the decided timeframe. Additionally, a business analyst holds regular meetings with the development team and the business stakeholders to track the project’s progress, accept change requests, and coordinate with the technical team to keep delivery on track.

Image Sources:

  1. Image 1: example.com
  2. Image 2:
  3. Image 3: Freshcode
  4. Image 4: Pinterest

The media shown in this article are not owned by Analytics Vidhya and are used at the Author’s discretion.


Source: https://www.analyticsvidhya.com/blog/2021/09/important-documents-prepared-by-a-business-analyst/


If you did not already know


DataOps
DataOps is an automated, process-oriented methodology, used by analytic and data teams, to improve the quality and reduce the cycle time of data analytics. While DataOps began as a set of best practices, it has now matured to become a new and independent approach to data analytics. DataOps applies to the entire data lifecycle from data preparation to reporting, and recognizes the interconnected nature of the data analytics team and information technology operations. From a process and methodology perspective, DataOps applies Agile software development, DevOps software development practices and the statistical process control used in lean manufacturing, to data analytics. In DataOps, development of new analytics is streamlined using Agile software development, an iterative project management methodology that replaces the traditional Waterfall sequential methodology. Studies show that software development projects complete significantly faster and with far fewer defects when Agile Development is used. The Agile methodology is particularly effective in environments where requirements are quickly evolving – a situation well known to data analytics professionals. DevOps focuses on continuous delivery by leveraging on-demand IT resources and by automating test and deployment of analytics. This merging of software development and IT operations has improved velocity, quality, predictability and scale of software engineering and deployment. Borrowing methods from DevOps, DataOps seeks to bring these same improvements to data analytics. Like lean manufacturing, DataOps utilizes statistical process control (SPC) to monitor and control the data analytics pipeline. With SPC in place, the data flowing through an operational system is constantly monitored and verified to be working. If an anomaly occurs, the data analytics team can be notified through an automated alert. DataOps is not tied to a particular technology, architecture, tool, language or framework. Tools that support DataOps promote collaboration, orchestration, agility, quality, security, access and ease of use. …
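As a concrete illustration of the SPC idea, here is a minimal Python sketch (the metric, thresholds, and numbers are invented for the example) that checks a pipeline metric, such as daily row counts, against three-sigma control limits and raises an alert on an out-of-control point:

```python
import statistics

def control_limits(history):
    """Classic three-sigma control limits computed from in-control history."""
    mean = statistics.mean(history)
    sigma = statistics.stdev(history)
    return mean - 3 * sigma, mean + 3 * sigma

def check_point(history, new_value, alert):
    """Alert if the new observation falls outside the control limits."""
    lower, upper = control_limits(history)
    if not (lower <= new_value <= upper):
        alert(f"Row count {new_value} outside control limits "
              f"({lower:.0f}, {upper:.0f})")

# Daily row counts from a healthy period, then a suspicious drop.
baseline = [10_120, 9_980, 10_045, 10_210, 9_890, 10_070, 10_150]
check_point(baseline, 6_500, alert=print)  # triggers the alert
```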

CoSegNet
We introduce CoSegNet, a deep neural network architecture for co-segmentation of a set of 3D shapes represented as point clouds. CoSegNet takes as input a set of unsegmented shapes, proposes per-shape parts, and then jointly optimizes the part labelings across the set subjected to a novel group consistency loss expressed via matrix rank estimates. The proposals are refined in each iteration by an auxiliary network that acts as a weak regularizing prior, pre-trained to denoise noisy, unlabeled parts from a large collection of segmented 3D shapes, where the part compositions within the same object category can be highly inconsistent. The output is a consistent part labeling for the input set, with each shape segmented into up to K (a user-specified hyperparameter) parts. The overall pipeline is thus weakly supervised, producing consistent segmentations tailored to the test set, without consistent ground-truth segmentations. We show qualitative and quantitative results from CoSegNet and evaluate it via ablation studies and comparisons to state-of-the-art co-segmentation methods. …

Stochastic Computation Graph (SCG)
Stochastic computation graphs are directed acyclic graphs that encode the dependency structure of computation to be performed. The graphical notation generalizes directed graphical models. …
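For intuition, the canonical problem an SCG formalism addresses is differentiating an expectation through a sampling step. One standard answer is the score-function (REINFORCE) estimator, which uses the identity ∇θ E[f(x)] = E[f(x) ∇θ log p(x; θ)]. A minimal numpy sketch with a single Gaussian stochastic node (this example is illustrative, not taken from the source):

```python
import numpy as np

rng = np.random.default_rng(0)

def score_function_gradient(theta, f, n_samples=100_000):
    """Estimate d/d(theta) of E_{x ~ N(theta, 1)}[f(x)] via the score-function trick.

    For x ~ N(theta, 1), the score is d/d(theta) log p(x; theta) = (x - theta).
    """
    x = rng.normal(theta, 1.0, size=n_samples)
    return np.mean(f(x) * (x - theta))

# For f(x) = x^2, E[f(x)] = theta^2 + 1, so the true gradient is 2 * theta.
theta = 1.5
print(score_function_gradient(theta, lambda x: x ** 2))  # approx. 3.0
```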

Smooth Density Spatial Quantile Regression
We derive the properties and demonstrate the desirability of a model-based method for estimating the spatially-varying effects of covariates on the quantile function. By modeling the quantile function as a combination of I-spline basis functions and Pareto tail distributions, we allow for flexible parametric modeling of the extremes while preserving non-parametric flexibility in the center of the distribution. We further establish that the model guarantees the desired degree of differentiability in the density function and enables the estimation of non-stationary covariance functions dependent on the predictors. We demonstrate through a simulation study that the proposed method produces more efficient estimates of the effects of predictors than other methods, particularly in distributions with heavy tails. To illustrate the utility of the model we apply it to measurements of benzene collected around an oil refinery to determine the effect of an emission source within the refinery on the distribution of the fence line measurements. …


Source: https://analytixon.com/2021/10/24/if-you-did-not-already-know-1540/



If you did not already know


Correntropy
Correntropy is a nonlinear similarity measure between two random variables.
Learning with the Maximum Correntropy Criterion Induced Losses for Regression
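With a Gaussian kernel, the sample correntropy of two paired signals is just the average kernel evaluation, V(X, Y) ≈ (1/N) Σ exp(−(xᵢ − yᵢ)² / (2σ²)). A minimal numpy sketch (the data are invented) that also shows why the measure is robust to outliers:

```python
import numpy as np

def correntropy(x, y, sigma=1.0):
    """Sample correntropy with a Gaussian kernel: mean of exp(-(x-y)^2 / (2*sigma^2))."""
    x, y = np.asarray(x, float), np.asarray(y, float)
    return np.mean(np.exp(-((x - y) ** 2) / (2.0 * sigma ** 2)))

a = np.array([1.0, 2.0, 3.0, 4.0])
print(correntropy(a, a))        # identical signals -> 1.0
print(correntropy(a, a + 0.5))  # small uniform mismatch -> close to 1
# A single huge outlier contributes ~0 to the mean instead of dominating it,
# unlike a squared-error average.
print(correntropy(a, a + np.array([0.0, 0.0, 0.0, 100.0])))  # approx. 0.75
```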


Patient Event Graph (PatientEG)
Medical activities, such as diagnoses, medicine treatments, and laboratory tests, as well as temporal relations between these activities are the basic concepts in clinical research. However, existing relational data model on electronic medical records (EMRs) lacks explicit and accurate semantic definitions of these concepts. It leads to the inconvenience of query construction and the inefficiency of query execution where multi-table join queries are frequently required. In this paper, we propose a patient event graph (PatientEG) model to capture the characteristics of EMRs. We respectively define five types of medical entities, five types of medical events and five types of temporal relations. Based on the proposed model, we also construct a PatientEG dataset with 191,294 events, 3,429 distinct entities, and 545,993 temporal relations using EMRs from Shanghai Shuguang hospital. To help to normalize entity values which contain synonyms, hyponymies, and abbreviations, we link them with the Chinese biomedical knowledge graph. With the help of PatientEG dataset, we are able to conveniently perform complex queries for clinical research such as auxiliary diagnosis and therapeutic effectiveness analysis. In addition, we provide a SPARQL endpoint to access PatientEG dataset and the dataset is also publicly available online. Also, we list several illustrative SPARQL queries on our website. …
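To give a flavor of such a query, here is a sketch using Python’s SPARQLWrapper. The endpoint URL, class names, and properties below are hypothetical stand-ins, since the abstract does not spell out the PatientEG schema:

```python
from SPARQLWrapper import SPARQLWrapper, JSON

# Hypothetical endpoint and vocabulary; the real PatientEG schema may differ.
endpoint = SPARQLWrapper("http://example.org/patienteg/sparql")
endpoint.setReturnFormat(JSON)
endpoint.setQuery("""
    PREFIX : <http://example.org/patienteg#>
    SELECT ?patient ?diagnosis ?treatment
    WHERE {
        ?event1 a :DiagnosisEvent ;
                :hasPatient ?patient ;
                :hasEntity ?diagnosis .
        ?event2 a :TreatmentEvent ;
                :hasPatient ?patient ;
                :hasEntity ?treatment .
        ?event1 :before ?event2 .   # temporal relation between events
    }
    LIMIT 10
""")
results = endpoint.query().convert()
for row in results["results"]["bindings"]:
    print(row["patient"]["value"], row["diagnosis"]["value"],
          row["treatment"]["value"])
```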

LogitBoost Autoregressive Networks
Multivariate binary distributions can be decomposed into products of univariate conditional distributions. Recently popular approaches have modeled these conditionals through neural networks with sophisticated weight-sharing structures. It is shown that state-of-the-art performance on several standard benchmark datasets can actually be achieved by training separate probability estimators for each dimension. In that case, model training can be trivially parallelized over data dimensions. On the other hand, complexity control has to be performed for each learned conditional distribution. Three possible methods are considered and experimentally compared. The estimator that is employed for each conditional is LogitBoost. Similarities and differences between the proposed approach and autoregressive models based on neural networks are discussed in detail. …
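The core idea is the autoregressive factorization p(x1, …, xD) = Π_d p(x_d | x_<d), with an independently trained estimator per dimension. A minimal sketch of that factorization in Python, using scikit-learn’s gradient boosting as a stand-in for LogitBoost, with toy data invented for the example:

```python
import numpy as np
from sklearn.ensemble import GradientBoostingClassifier

rng = np.random.default_rng(0)
X = (rng.random((500, 4)) > 0.5).astype(int)  # toy binary data, 4 dimensions

# One independent estimator per dimension d, trained on the preceding dims x_<d.
# These fits are independent, so they could trivially be run in parallel.
models = []
for d in range(1, X.shape[1]):
    clf = GradientBoostingClassifier(n_estimators=20)
    clf.fit(X[:, :d], X[:, d])
    models.append(clf)

def log_likelihood(x):
    """log p(x) = log p(x_0) + sum over d of log p(x_d | x_<d)."""
    p0 = X[:, 0].mean()  # marginal of the first dimension
    ll = np.log(p0 if x[0] else 1 - p0)
    for d, clf in enumerate(models, start=1):
        probs = clf.predict_proba(x[:d].reshape(1, -1))[0]
        ll += np.log(probs[list(clf.classes_).index(x[d])])
    return ll

print(log_likelihood(np.array([1, 0, 1, 1])))
```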

Discretification
‘Discretification’ is the mechanism of making continuous data discrete. If you really grasp the concept, you may be thinking ‘Wait a minute, the type of data we are collecting is discrete in and of itself! Data can EITHER be discrete OR continuous, it can’t be both!’ You would be correct. But what if we manually selected values along that continuous measurement, and declared them to be in a specific category? For instance, if we declare 72.0 degrees and greater to be ‘Hot’, 35.0-71.9 degrees to be ‘Moderate’, and anything lower than 35.0 degrees to be ‘Cold’, we have ‘discretified’ temperature! Our readings that were once continuous now fit into distinct categories. So, where do we draw the boundaries for these categories? What makes 34.9 degrees ‘Cold’ and 35.0 degrees ‘Moderate’? It is at this juncture that the TRUE decision is being made. The beauty of approaching the challenge in this manner is that it is data-centric, not concept-centric. Let’s walk through our marketing example first without using discretification, then with it. …
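The temperature example translates directly into code. A minimal pandas sketch (with invented readings) that draws exactly those boundaries:

```python
import pandas as pd

temps = pd.Series([28.4, 34.9, 35.0, 50.2, 71.9, 72.0, 88.3])

# Bin edges implement the decision: <35.0 Cold, 35.0-71.9 Moderate, >=72.0 Hot.
labels = pd.cut(
    temps,
    bins=[float("-inf"), 35.0, 72.0, float("inf")],
    right=False,  # intervals are [low, high), so 35.0 falls in 'Moderate'
    labels=["Cold", "Moderate", "Hot"],
)
print(pd.DataFrame({"temperature": temps, "category": labels}))
```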


Source: https://analytixon.com/2021/10/23/if-you-did-not-already-know-1539/



Capturing the signal of weak electricigens: a worthy endeavour


Recently, several non-traditional electroactive microorganisms have been discovered. These can be considered weak electricigens: microorganisms that typically rely on soluble electron acceptors and donors in their lifecycle but are also capable of extracellular electron transfer (EET), resulting in either a low, unreliable, or otherwise unexpected current. These unanticipated electroactive microorganisms represent a new chapter in electromicrobiology and have important medical, environmental, and biotechnological relevance.

Source: https://www.cell.com/trends/biotechnology/fulltext/S0167-7799(21)00229-8?rss=yes
