What is Statistical Analysis? Types, Methods, Software, Examples

Appinio Research · 29.02.2024 · 31 min read

Ever wondered how we make sense of vast amounts of data to make informed decisions? Statistical analysis is the answer. In our data-driven world, statistical analysis serves as a powerful tool to uncover patterns, trends, and relationships hidden within data. From predicting sales trends to assessing the effectiveness of new treatments, statistical analysis empowers us to derive meaningful insights and drive evidence-based decision-making across various fields and industries.

In this guide, we'll explore the fundamentals of statistical analysis, popular methods, software tools, practical examples, and best practices to help you harness the power of statistics effectively. Whether you're a novice or an experienced analyst, this guide will equip you with the knowledge and skills to navigate the world of statistical analysis with confidence.

 

What is Statistical Analysis?

Statistical analysis is a methodical process of collecting, analyzing, interpreting, and presenting data to uncover patterns, trends, and relationships. It involves applying statistical techniques and methodologies to make sense of complex data sets and draw meaningful conclusions.

Importance of Statistical Analysis

Statistical analysis plays a crucial role in various fields and industries due to its numerous benefits and applications:

  • Informed Decision Making: Statistical analysis provides valuable insights that inform decision-making processes in business, healthcare, government, and academia. By analyzing data, organizations can identify trends, assess risks, and optimize strategies for better outcomes.
  • Evidence-Based Research: Statistical analysis is fundamental to scientific research, enabling researchers to test hypotheses, draw conclusions, and validate theories using empirical evidence. It helps researchers quantify relationships, assess the significance of findings, and advance knowledge in their respective fields.
  • Quality Improvement: In manufacturing and quality management, statistical analysis helps identify defects, improve processes, and enhance product quality. Techniques such as Six Sigma and Statistical Process Control (SPC) are used to monitor performance, reduce variation, and achieve quality objectives.
  • Risk Assessment: In finance, insurance, and investment, statistical analysis is used for risk assessment and portfolio management. By analyzing historical data and market trends, analysts can quantify risks, forecast outcomes, and make informed decisions to mitigate financial risks.
  • Predictive Modeling: Statistical analysis enables predictive modeling and forecasting in various domains, including sales forecasting, demand planning, and weather prediction. By analyzing historical data patterns, predictive models can anticipate future trends and outcomes with reasonable accuracy.
  • Healthcare Decision Support: In healthcare, statistical analysis is integral to clinical research, epidemiology, and healthcare management. It helps healthcare professionals assess treatment effectiveness, analyze patient outcomes, and optimize resource allocation for improved patient care.

Statistical Analysis Applications

Statistical analysis finds applications across diverse domains and disciplines, including:

  • Business and Economics: Market research, financial analysis, econometrics, and business intelligence.
  • Healthcare and Medicine: Clinical trials, epidemiological studies, healthcare outcomes research, and disease surveillance.
  • Social Sciences: Survey research, demographic analysis, psychology experiments, and public opinion polls.
  • Engineering: Reliability analysis, quality control, process optimization, and product design.
  • Environmental Science: Environmental monitoring, climate modeling, and ecological research.
  • Education: Educational research, assessment, program evaluation, and learning analytics.
  • Government and Public Policy: Policy analysis, program evaluation, census data analysis, and public administration.
  • Technology and Data Science: Machine learning, artificial intelligence, data mining, and predictive analytics.

These applications demonstrate the versatility and significance of statistical analysis in addressing complex problems and informing decision-making across various sectors and disciplines.

Fundamentals of Statistics

Understanding the fundamentals of statistics is crucial for conducting meaningful analyses. Let's delve into some essential concepts that form the foundation of statistical analysis.

Basic Concepts

Statistics is the science of collecting, organizing, analyzing, and interpreting data to make informed decisions and draw sound conclusions. To embark on your statistical journey, familiarize yourself with these fundamental concepts:

  • Population vs. Sample: A population comprises all the individuals or objects of interest in a study, while a sample is a subset of the population selected for analysis. Understanding the distinction between these two entities is vital, as statistical analyses often rely on samples to draw conclusions about populations.
  • Variables: In statistics, variables represent characteristics or attributes that can take different values. They can be classified into two main types:
    • Independent Variables: Variables that are manipulated or controlled in an experiment.
    • Dependent Variables: Variables that are observed or measured in response to changes in independent variables.
  • Parameters vs. Statistics: Parameters are numerical measures that describe a population, whereas statistics are numerical measures that describe a sample. For instance, the population mean is denoted by μ (mu), while the sample mean is denoted by x̄ (x-bar). The short sketch below makes this distinction concrete.
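
To make the parameter-versus-statistic distinction concrete, here is a minimal Python sketch. The population of heights, the random seed, and the sample size are all hypothetical, and NumPy is assumed to be installed.

```python
import numpy as np

rng = np.random.default_rng(seed=42)

# Hypothetical population of 100,000 heights (cm)
population = rng.normal(loc=170, scale=10, size=100_000)

# A random sample of 200 individuals drawn from that population
sample = rng.choice(population, size=200, replace=False)

mu = population.mean()    # parameter: population mean (mu)
x_bar = sample.mean()     # statistic: sample mean (x-bar)

print(f"Population mean (mu): {mu:.2f}")
print(f"Sample mean (x-bar):  {x_bar:.2f}")
```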

Descriptive Statistics

Descriptive statistics involve methods for summarizing and describing the features of a dataset. These statistics provide insights into the central tendency, variability, and distribution of the data. A short computational sketch follows the list. Standard measures of descriptive statistics include:

  • Measures of Central Tendency: These measures, including the mean, median, and mode, provide information about the central or typical value of a dataset.
    • Mean: The arithmetic average of a set of values, calculated by summing all values and dividing by the number of observations.
    • Median: The middle value in a sorted list of observations.
    • Mode: The value that appears most frequently in a dataset.
  • Measures of Dispersion: Dispersion measures, such as the range, variance, and standard deviation, quantify the spread or variability of data points around the central tendency.
    • Range: The difference between the maximum and minimum values in a dataset.
    • Variance: The average of the squared differences from the mean.
    • Standard Deviation: The square root of the variance, providing a measure of the average distance of data points from the mean.
  • Graphical Techniques: Graphical representations, including histograms, box plots, and scatter plots, offer visual insights into the distribution and relationships within a dataset. These visualizations aid in identifying patterns, outliers, and trends.
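
As a quick illustration of these measures, Python's built-in statistics module can compute them directly; the dataset below is a small hypothetical example.

```python
import statistics

data = [12, 15, 15, 18, 21, 24, 30]

print("Mean:            ", statistics.mean(data))      # central tendency
print("Median:          ", statistics.median(data))
print("Mode:            ", statistics.mode(data))
print("Range:           ", max(data) - min(data))      # dispersion
print("Sample variance: ", statistics.variance(data))
print("Sample std. dev.:", statistics.stdev(data))
```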

Inferential Statistics

Inferential statistics enable researchers to draw conclusions or make predictions about populations based on sample data. These methods allow for generalizations beyond the observed data. A brief worked example follows the list. Fundamental techniques in inferential statistics include:

  • Hypothesis Testing: Hypothesis testing involves making decisions about population parameters based on sample data. It typically follows a structured process that includes formulating hypotheses, selecting an appropriate test statistic, determining the significance level, and interpreting results.
    • Null Hypothesis (H0): The hypothesis that there is no effect, difference, or relationship in the population.
    • Alternative Hypothesis (H1): The hypothesis that there is an effect, difference, or relationship in the population.
  • Confidence Intervals: Confidence intervals provide a range of plausible values for a population parameter. They offer insights into the precision of sample estimates and the uncertainty associated with those estimates.
  • Regression Analysis: Regression analysis examines the relationship between one or more independent variables and a dependent variable. It allows for the prediction of the dependent variable based on the values of the independent variables.
  • Sampling Methods: Sampling methods, such as simple random sampling, stratified sampling, and cluster sampling, are employed to ensure that sample data are representative of the population of interest. These methods help mitigate biases and improve the generalizability of results.
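
As a brief worked example, here is how a 95% confidence interval for a population mean might be computed in Python. SciPy is assumed to be installed, and the measurements are hypothetical.

```python
import numpy as np
from scipy import stats

sample = np.array([4.1, 5.0, 4.7, 5.3, 4.9, 5.1, 4.6, 5.2])

mean = sample.mean()
sem = stats.sem(sample)  # standard error of the mean

# 95% CI based on the t distribution with n - 1 degrees of freedom
low, high = stats.t.interval(0.95, len(sample) - 1, loc=mean, scale=sem)

print(f"Sample mean: {mean:.2f}")
print(f"95% CI: ({low:.2f}, {high:.2f})")
```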

Probability Distributions

Probability distributions describe the likelihood of different outcomes in a statistical experiment. Understanding these distributions is essential for modeling and analyzing random phenomena. A short code sketch of each follows the list. Some common probability distributions include:

  • Normal Distribution: The normal distribution, also known as the Gaussian distribution, is characterized by a symmetric, bell-shaped curve. Many natural phenomena follow this distribution, making it widely applicable in statistical analysis.
  • Binomial Distribution: The binomial distribution describes the number of successes in a fixed number of independent Bernoulli trials. It is commonly used to model binary outcomes, such as success or failure, heads or tails.
  • Poisson Distribution: The Poisson distribution models the number of events occurring in a fixed interval of time or space. It is often used to analyze rare or discrete events, such as the number of customer arrivals in a queue within a given time period.
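
The sketch below evaluates each of these distributions with scipy.stats (assumed installed); the parameter values are purely illustrative.

```python
from scipy import stats

# Normal: P(X <= 1) for a standard normal variable
print(stats.norm.cdf(1, loc=0, scale=1))    # ~0.8413

# Binomial: probability of exactly 7 successes in 10 trials with p = 0.5
print(stats.binom.pmf(7, n=10, p=0.5))      # ~0.1172

# Poisson: probability of exactly 2 arrivals when the mean rate is 3
print(stats.poisson.pmf(2, mu=3))           # ~0.2240
```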

Types of Statistical Analysis

Statistical analysis encompasses a diverse range of methods and approaches, each suited to different types of data and research questions. Understanding the various types of statistical analysis is essential for selecting the most appropriate technique for your analysis. Let's explore some common distinctions in statistical analysis methods.

Parametric vs. Non-parametric Analysis

Parametric and non-parametric analyses represent two broad categories of statistical methods, each with its own assumptions and applications. A side-by-side code sketch follows the list below.

  • Parametric Analysis: Parametric methods assume that the data follow a specific probability distribution, often the normal distribution. These methods rely on estimating parameters (e.g., means, variances) from the data. Parametric tests typically provide more statistical power but require stricter assumptions.
    Examples of parametric tests include t-tests, ANOVA, and linear regression.
  • Non-parametric Analysis: Non-parametric methods make fewer assumptions about the underlying distribution of the data. Instead of estimating parameters, non-parametric tests rely on ranks or other distribution-free techniques. Non-parametric tests are often used when data do not meet the assumptions of parametric tests or when dealing with ordinal or non-normal data.
    Examples of non-parametric tests include the Wilcoxon rank-sum test, Kruskal-Wallis test, and Spearman correlation.
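
The following sketch runs a parametric test and its non-parametric counterpart side by side on the same hypothetical data, assuming SciPy is installed.

```python
from scipy import stats

group_a = [23, 25, 28, 31, 22, 27, 26]
group_b = [30, 33, 29, 35, 32, 34, 31]

# Parametric: independent two-sample t-test (assumes normality)
t_stat, t_p = stats.ttest_ind(group_a, group_b)

# Non-parametric: Mann-Whitney U / Wilcoxon rank-sum (distribution-free)
u_stat, u_p = stats.mannwhitneyu(group_a, group_b, alternative="two-sided")

print(f"t-test:         t = {t_stat:.2f}, p = {t_p:.4f}")
print(f"Mann-Whitney U: U = {u_stat:.1f}, p = {u_p:.4f}")
```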

Descriptive vs. Inferential Analysis

Descriptive and inferential analyses serve distinct purposes in statistical analysis, focusing on summarizing data and making inferences about populations, respectively.

  • Descriptive Analysis: Descriptive statistics aim to describe and summarize the features of a dataset. These statistics provide insights into the central tendency, variability, and distribution of the data.
    Descriptive analysis techniques include measures of central tendency (e.g., mean, median, mode), measures of dispersion (e.g., variance, standard deviation), and graphical representations (e.g., histograms, box plots).
  • Inferential Analysis: Inferential statistics involve making inferences or predictions about populations based on sample data. These methods allow researchers to generalize findings from the sample to the larger population.
    Inferential analysis techniques include hypothesis testing, confidence intervals, regression analysis, and sampling methods. These methods help researchers draw conclusions about population parameters, such as means, proportions, or correlations, based on sample data.

Exploratory vs. Confirmatory Analysis

Exploratory and confirmatory analyses represent two different approaches to data analysis, each serving distinct purposes in the research process.

  • Exploratory Analysis: Exploratory data analysis (EDA) focuses on exploring data to discover patterns, relationships, and trends. EDA techniques involve visualizing data, identifying outliers, and generating hypotheses for further investigation.
    Exploratory analysis is particularly useful in the early stages of research when the goal is to gain insights and generate hypotheses rather than confirm specific hypotheses.
  • Confirmatory Analysis: Confirmatory data analysis involves testing predefined hypotheses or theories based on prior knowledge or assumptions. Confirmatory analysis follows a structured approach, where hypotheses are tested using appropriate statistical methods.
    Confirmatory analysis is common in hypothesis-driven research, where the goal is to validate or refute specific hypotheses using empirical evidence. Techniques such as hypothesis testing, regression analysis, and experimental design are often employed in confirmatory analysis.

Methods of Statistical Analysis

Statistical analysis employs various methods to extract insights from data and make informed decisions. Let's explore some of the key methods used in statistical analysis and their applications.

Hypothesis Testing

Hypothesis testing is a fundamental concept in statistics, allowing researchers to make decisions about population parameters based on sample data. The process involves formulating null and alternative hypotheses, selecting an appropriate test statistic, determining the significance level, and interpreting the results. A minimal worked example follows the list. Standard hypothesis tests include:

  • t-tests: Used to compare means between two groups.
  • ANOVA (Analysis of Variance): Extends the t-test to compare means across multiple groups.
  • Chi-square test: Used to assess the association between categorical variables.
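
As a minimal example, here is a chi-square test of independence on a hypothetical 2x2 contingency table (t-tests and ANOVA are sketched in other sections of this guide); SciPy is assumed to be installed.

```python
from scipy.stats import chi2_contingency

# Hypothetical counts:  improved | not improved
table = [[30, 10],   # treatment group
         [18, 22]]   # control group

chi2, p, dof, expected = chi2_contingency(table)
print(f"chi2 = {chi2:.2f}, p = {p:.4f}, dof = {dof}")
```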

Regression Analysis

Regression analysis explores the relationship between one or more independent variables and a dependent variable. It is widely used in predictive modeling and understanding the impact of variables on outcomes. A short fitting sketch follows the list. Key types of regression analysis include:

  • Simple Linear Regression: Examines the linear relationship between one independent variable and a dependent variable.
  • Multiple Linear Regression: Extends simple linear regression to analyze the relationship between multiple independent variables and a dependent variable.
  • Logistic Regression: Used for predicting binary outcomes or modeling probabilities.
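
Here is a minimal simple linear regression fit using statsmodels (an assumption here; any regression library would do). The advertising and sales figures are hypothetical.

```python
import numpy as np
import statsmodels.api as sm

ad_spend = np.array([1.0, 2.0, 3.0, 4.0, 5.0])   # e.g., in $1,000s
sales = np.array([2.1, 3.9, 6.2, 7.8, 10.1])     # e.g., in $1,000s

X = sm.add_constant(ad_spend)     # adds the intercept term
model = sm.OLS(sales, X).fit()    # ordinary least squares fit

print(model.params)      # [intercept, slope] estimates
print(model.rsquared)    # proportion of variance explained
```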

Analysis of Variance (ANOVA)

ANOVA is a statistical technique used to compare means across two or more groups. It partitions the total variability in the data into components attributable to different sources, such as between-group differences and within-group variability. ANOVA is commonly used in experimental design and hypothesis testing scenarios.
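
A one-way ANOVA takes only a few lines with SciPy's f_oneway (assumed installed); the three groups below are hypothetical.

```python
from scipy.stats import f_oneway

group_a = [78, 85, 82, 88, 75]
group_b = [80, 83, 79, 81, 84]
group_c = [90, 92, 88, 95, 91]

f_stat, p_value = f_oneway(group_a, group_b, group_c)
print(f"F = {f_stat:.2f}, p = {p_value:.4f}")
```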

Time Series Analysis

Time series analysis deals with analyzing data collected or recorded at successive time intervals. It helps identify patterns, trends, and seasonality in the data. A decomposition sketch follows the list. Time series analysis techniques include:

  • Trend Analysis: Identifying long-term trends or patterns in the data.
  • Seasonal Decomposition: Separating the data into seasonal, trend, and residual components.
  • Forecasting: Predicting future values based on historical data.
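
The sketch below decomposes a synthetic monthly series into trend, seasonal, and residual components using statsmodels (assumed installed); the series itself is fabricated for illustration.

```python
import numpy as np
import pandas as pd
from statsmodels.tsa.seasonal import seasonal_decompose

# A synthetic 4-year monthly series: upward trend plus yearly seasonality
index = pd.date_range("2020-01-01", periods=48, freq="MS")
trend = np.linspace(100, 160, 48)
season = 10 * np.sin(2 * np.pi * index.month / 12)
series = pd.Series(trend + season, index=index)

result = seasonal_decompose(series, model="additive", period=12)
print(result.trend.dropna().head())    # estimated long-term trend
print(result.seasonal.head())          # estimated seasonal component
```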

Survival Analysis

Survival analysis is used to analyze time-to-event data, such as time until death, failure, or occurrence of an event of interest. It is widely used in medical research, engineering, and social sciences to analyze survival probabilities and hazard rates over time.
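
As an illustration, the Kaplan-Meier estimator can be fit with the third-party lifelines package (an assumption here; it is installed separately via pip install lifelines). The durations and event indicators are hypothetical.

```python
from lifelines import KaplanMeierFitter

durations = [5, 6, 6, 2, 4, 4, 8, 9, 10, 12]   # months until event or censoring
events = [1, 0, 1, 1, 1, 0, 1, 1, 0, 1]        # 1 = event observed, 0 = censored

kmf = KaplanMeierFitter()
kmf.fit(durations, event_observed=events)

print(kmf.survival_function_.head())   # estimated survival probabilities
print(kmf.median_survival_time_)       # median time to event
```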

Factor Analysis

Factor analysis is a statistical method used to identify underlying factors or latent variables that explain patterns of correlations among observed variables. It is commonly used in psychology, sociology, and market research to uncover underlying dimensions or constructs.
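
A minimal factor analysis sketch with scikit-learn (assumed installed) is shown below; because the data are random, the extracted loadings carry no real meaning and serve only to show the mechanics.

```python
import numpy as np
from sklearn.decomposition import FactorAnalysis

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 6))    # 100 observations of 6 observed variables

fa = FactorAnalysis(n_components=2, random_state=0)
fa.fit(X)

print(fa.components_.shape)      # (2, 6): loadings of each latent factor
```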

Cluster Analysis

Cluster analysis is a multivariate technique that groups similar objects or observations into clusters or segments based on their characteristics. It is widely used in market segmentation, image processing, and biological classification.
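
Here is a minimal k-means clustering sketch with scikit-learn (assumed installed) on hypothetical two-dimensional points.

```python
import numpy as np
from sklearn.cluster import KMeans

X = np.array([[1, 2], [1, 4], [1, 0],
              [10, 2], [10, 4], [10, 0]])

kmeans = KMeans(n_clusters=2, n_init=10, random_state=0).fit(X)

print(kmeans.labels_)            # cluster assignment for each point
print(kmeans.cluster_centers_)   # coordinates of the two centroids
```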

Principal Component Analysis (PCA)

PCA is a dimensionality reduction technique used to transform high-dimensional data into a lower-dimensional space while preserving most of the variability in the data. It identifies orthogonal axes (principal components) that capture the maximum variance in the data. PCA is useful for data visualization, feature selection, and data compression.
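
The following sketch projects hypothetical four-dimensional data onto two principal components with scikit-learn (assumed installed).

```python
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(1)
X = rng.normal(size=(50, 4))    # 50 observations, 4 features

pca = PCA(n_components=2)
X_reduced = pca.fit_transform(X)

print(X_reduced.shape)                  # (50, 2)
print(pca.explained_variance_ratio_)    # variance captured per component
```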

How to Choose the Right Statistical Analysis Method?

Selecting the appropriate statistical method is crucial for obtaining accurate and meaningful results from your data analysis.

Understanding Data Types and Distribution

Before choosing a statistical method, it's essential to understand the types of data you're working with and their distribution. Different statistical methods are suitable for different types of data:

  • Continuous vs. Categorical Data: Determine whether your data are continuous (e.g., height, weight) or categorical (e.g., gender, race). Parametric methods such as t-tests and regression are typically used for continuous data, while non-parametric methods like chi-square tests are suitable for categorical data.
  • Normality: Assess whether your data follow a normal distribution. Parametric methods often assume normality, so if your data are not normally distributed, non-parametric methods may be more appropriate.

Assessing Assumptions

Many statistical methods rely on certain assumptions about the data. Before applying a method, it's essential to assess whether these assumptions are met; a short sketch of two common checks follows the list:

  • Independence: Ensure that observations are independent of each other. Violations of independence assumptions can lead to biased results.
  • Homogeneity of Variance: Verify that variances are approximately equal across groups, especially in ANOVA and regression analyses. Levene's test or Bartlett's test can be used to assess homogeneity of variance.
  • Linearity: Check for linear relationships between variables, particularly in regression analysis. Residual plots can help diagnose violations of linearity assumptions.
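
As a sketch, the normality and homogeneity-of-variance checks mentioned above might look like this in Python, assuming SciPy is installed; the groups are hypothetical.

```python
from scipy import stats

group_a = [23, 25, 28, 31, 22, 27, 26]
group_b = [30, 33, 29, 35, 32, 34, 31]

# Shapiro-Wilk: p > 0.05 suggests no evidence against normality
print(stats.shapiro(group_a))

# Levene's test: p > 0.05 suggests variances are roughly equal
print(stats.levene(group_a, group_b))
```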

Considering Research Objectives

Your research objectives should guide the selection of the appropriate statistical method.

  • What are you trying to achieve with your analysis?: Determine whether you're interested in comparing groups, predicting outcomes, exploring relationships, or identifying patterns.
  • What type of data are you analyzing?: Choose methods that are suitable for your data type and research questions.
  • Are you testing specific hypotheses or exploring data for insights?: Confirmatory analyses involve testing predefined hypotheses, while exploratory analyses focus on discovering patterns or relationships in the data.

Consulting Statistical Experts

If you're unsure about the most appropriate statistical method for your analysis, don't hesitate to seek advice from statistical experts or consultants:

  • Collaborate with Statisticians: Statisticians can provide valuable insights into the strengths and limitations of different statistical methods and help you select the most appropriate approach.
  • Utilize Resources: Take advantage of online resources, forums, and statistical software documentation to learn about different methods and their applications.
  • Peer Review: Consider seeking feedback from colleagues or peers familiar with statistical analysis to validate your approach and ensure rigor in your analysis.

By carefully considering these factors and consulting with experts when needed, you can confidently choose a suitable statistical method to address your research questions and obtain reliable results.

Statistical Analysis Software

Choosing the right software for statistical analysis is crucial for efficiently processing and interpreting your data. In addition to statistical analysis software, it's essential to consider tools for data collection, which lay the foundation for meaningful analysis.

What is Statistical Analysis Software?

Statistical software provides a range of tools and functionalities for data analysis, visualization, and interpretation. These software packages offer user-friendly interfaces and robust analytical capabilities, making them indispensable tools for researchers, analysts, and data scientists.

  • Graphical User Interface (GUI): Many statistical software packages offer intuitive GUIs that allow users to perform analyses using point-and-click interfaces. This makes statistical analysis accessible to users with varying levels of programming expertise.
  • Scripting and Programming: Advanced users can leverage scripting and programming capabilities within statistical software to automate analyses, customize functions, and extend the software's functionality.
  • Visualization: Statistical software often includes built-in visualization tools for creating charts, graphs, and plots to visualize data distributions, relationships, and trends.
  • Data Management: These software packages provide features for importing, cleaning, and manipulating datasets, ensuring data integrity and consistency throughout the analysis process.

Popular Statistical Analysis Software

Several statistical software packages are widely used in various industries and research domains. Some of the most popular options include:

  • R: R is a free, open-source programming language and software environment for statistical computing and graphics. It offers a vast ecosystem of packages for data manipulation, visualization, and analysis, making it a popular choice among statisticians and data scientists.
  • Python: Python is a versatile programming language with robust libraries like NumPy, SciPy, and pandas for data analysis and scientific computing. Python's simplicity and flexibility make it an attractive option for statistical analysis, particularly for users with programming experience.
  • SPSS: SPSS (Statistical Package for the Social Sciences) is a comprehensive statistical software package widely used in social science research, marketing, and healthcare. It offers a user-friendly interface and a wide range of statistical procedures for data analysis and reporting.
  • SAS: SAS (Statistical Analysis System) is a powerful statistical software suite used for data management, advanced analytics, and predictive modeling. SAS is commonly employed in industries such as healthcare, finance, and government for data-driven decision-making.
  • Stata: Stata is a statistical software package that provides tools for data analysis, manipulation, and visualization. It is popular in academic research, economics, and social sciences for its robust statistical capabilities and ease of use.
  • MATLAB: MATLAB is a high-level programming language and environment for numerical computing and visualization. It offers built-in functions and toolboxes for statistical analysis, machine learning, and signal processing.

Data Collection Software

In addition to statistical analysis software, data collection software plays a crucial role in the research process. These tools facilitate data collection, management, and organization from various sources, ensuring data quality and reliability.

 

When it comes to data collection, precision and efficiency are paramount. Appinio offers a seamless solution for gathering real-time consumer insights, empowering you to make informed decisions swiftly. With our intuitive platform, you can define your target audience with precision, launch surveys effortlessly, and access valuable data in minutes.

 

Experience the power of Appinio and elevate your data collection process today. Ready to see it in action? Book a demo now!

How to Choose the Right Statistical Analysis Software?

When selecting software for statistical analysis and data collection, consider the following factors:

  • Compatibility: Ensure the software is compatible with your operating system, hardware, and data formats.
  • Usability: Choose software that aligns with your level of expertise and provides features that meet your analysis and data collection requirements.
  • Integration: Consider whether the software integrates with other tools and platforms in your workflow, such as data visualization software or data storage systems.
  • Cost and Licensing: Evaluate the cost of licensing or subscription fees, as well as any additional costs for training, support, or maintenance.

By carefully evaluating these factors and considering your specific analysis and data collection needs, you can select the right software tools to support your research objectives and drive meaningful insights from your data.

Statistical Analysis Examples

Understanding statistical analysis methods is best achieved through practical examples. Let's explore three examples that demonstrate the application of statistical techniques in real-world scenarios.

Example 1: Linear Regression

Scenario: A marketing analyst wants to understand the relationship between advertising spending and sales revenue for a product.

 

Data: The analyst collects data on monthly advertising expenditures (in dollars) and corresponding sales revenue (in dollars) over the past year.

 

Analysis: Using simple linear regression, the analyst fits a regression model to the data, where advertising spending is the independent variable (X) and sales revenue is the dependent variable (Y). The regression analysis estimates the linear relationship between advertising spending and sales revenue, allowing the analyst to predict sales based on advertising expenditures.

 

Result: The regression analysis reveals a statistically significant positive relationship between advertising spending and sales revenue. For every additional dollar spent on advertising, sales revenue increases by the estimated slope coefficient. The analyst can use this information to optimize advertising budgets and forecast sales performance.

Example 2: Hypothesis Testing

Scenario: A pharmaceutical company develops a new drug intended to lower blood pressure. The company wants to determine whether the new drug is more effective than the existing standard treatment.

 

Data: The company conducts a randomized controlled trial (RCT) involving two groups of participants: one group receives the new drug, and the other receives the standard treatment. Blood pressure measurements are taken before and after the treatment period.

 

Analysis: The company uses hypothesis testing, specifically a two-sample t-test, to compare the mean reduction in blood pressure between the two groups. The null hypothesis (H0) states that there is no difference in the mean reduction in blood pressure between the two treatments, while the alternative hypothesis (H1) suggests that the new drug is more effective.

 

Result: The t-test results indicate a statistically significant difference in the mean reduction in blood pressure between the two groups. The company concludes that the new drug is more effective than the standard treatment in lowering blood pressure, based on the evidence from the RCT.

Example 3: ANOVA

Scenario: A researcher wants to compare the effectiveness of three different teaching methods on student performance in a mathematics course.

 

Data: The researcher conducts an experiment where students are randomly assigned to one of three groups: traditional lecture-based instruction, active learning, or flipped classroom. At the end of the semester, students' scores on a standardized math test are recorded.

 

Analysis: The researcher performs an analysis of variance (ANOVA) to compare the mean test scores across the three teaching methods. ANOVA assesses whether there are statistically significant differences in mean scores between the groups.

 

Result: The ANOVA results reveal a significant difference in mean test scores between the three teaching methods. Post-hoc tests, such as Tukey's HSD (Honestly Significant Difference), can be conducted to identify which specific teaching methods differ significantly from each other in terms of student performance.

 

These examples illustrate how statistical analysis techniques can be applied to address various research questions and make data-driven decisions in different fields. By understanding and applying these methods effectively, researchers and analysts can derive valuable insights from their data to inform decision-making and drive positive outcomes.

Statistical Analysis Best Practices

Statistical analysis is a powerful tool for extracting insights from data, but it's essential to follow best practices to ensure the validity, reliability, and interpretability of your results.

  1. Clearly Define Research Questions: Before conducting any analysis, clearly define your research questions or objectives. This ensures that your analysis is focused and aligned with the goals of your study.
  2. Choose Appropriate Methods: Select statistical methods suitable for your data type, research design, and objectives. Consider factors such as data distribution, sample size, and assumptions of the chosen method.
  3. Preprocess Data: Clean and preprocess your data to remove errors, outliers, and missing values. Data preprocessing steps may include data cleaning, normalization, and transformation to ensure data quality and consistency.
  4. Check Assumptions: Verify that the assumptions of the chosen statistical methods are met. Assumptions may include normality, homogeneity of variance, independence, and linearity. Conduct diagnostic tests or exploratory data analysis to assess assumptions.
  5. Transparent Reporting: Document your analysis procedures, including data preprocessing steps, statistical methods used, and any assumptions made. Transparent reporting enhances reproducibility and allows others to evaluate the validity of your findings.
  6. Consider Sample Size: Ensure that your sample size is sufficient to detect meaningful effects or relationships. Power analysis can help determine the minimum sample size required to achieve adequate statistical power (see the sketch after this list).
  7. Interpret Results Cautiously: Interpret statistical results with caution and consider the broader context of your research. Be mindful of effect sizes, confidence intervals, and practical significance when interpreting findings.
  8. Validate Findings: Validate your findings through robustness checks, sensitivity analyses, or replication studies. Cross-validation and bootstrapping techniques can help assess the stability and generalizability of your results.
  9. Avoid P-Hacking and Data Dredging: Guard against p-hacking and data dredging by pre-registering hypotheses, conducting planned analyses, and avoiding selective reporting of results. Maintain transparency and integrity in your analysis process.
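
As a sketch of the power analysis mentioned in point 6, statsmodels (assumed installed) can estimate the per-group sample size for a two-sample t-test; the effect size, alpha, and power values below are illustrative conventions, not recommendations.

```python
from statsmodels.stats.power import TTestIndPower

analysis = TTestIndPower()
n_per_group = analysis.solve_power(
    effect_size=0.5,   # medium effect (Cohen's d)
    alpha=0.05,        # significance level
    power=0.8,         # desired statistical power
)
print(f"Required sample size per group: {n_per_group:.1f}")   # ~63.8
```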

By following these best practices, you can conduct rigorous and reliable statistical analyses that yield meaningful insights and contribute to evidence-based decision-making in your field.

Conclusion for Statistical Analysis

Statistical analysis is a vital tool for making sense of data and guiding decision-making across diverse fields. By understanding the fundamentals of statistical analysis, including concepts like hypothesis testing, regression analysis, and data visualization, you gain the ability to extract valuable insights from complex datasets. Moreover, selecting the appropriate statistical methods, choosing the right software, and following best practices ensure the validity and reliability of your analyses.

In today's data-driven world, the ability to conduct rigorous statistical analysis is a valuable skill that empowers individuals and organizations to make informed decisions and drive positive outcomes. Whether you're a researcher, analyst, or decision-maker, mastering statistical analysis opens doors to new opportunities for understanding the world around us and unlocking the potential of data to solve real-world problems.

How to Collect Data for Statistical Analysis in Minutes?

Introducing Appinio, your gateway to effortless data collection for statistical analysis. As a real-time market research platform, Appinio specializes in delivering instant consumer insights, empowering businesses to make swift, data-driven decisions.

 

With Appinio, conducting your own market research is not only feasible but also exhilarating. Here's why:

  • Obtain insights in minutes, not days: From posing questions to uncovering insights, Appinio accelerates the entire research process, ensuring rapid access to valuable data.
  • User-friendly interface: No advanced degrees required! Our platform is designed to be intuitive and accessible to anyone, allowing you to dive into market research with confidence.
  • Targeted surveys, global reach: Define your target audience with precision using our extensive array of demographic and psychographic characteristics, and reach respondents in over 90 countries effortlessly.

 
