Collecting, Analysing, and Interpreting Quantitative Data
- admin
- Apr 15, 2021
- 12 min read
Members of group 3:
1. Irma Imroatina (2018111043)
2. Alfa Fitri Fajriyati (20181111047)
3. Auliatuz Zahrotul Jannah (20181111053)
Data Collection
Data collection is defined as the procedure of collecting, measuring and analyzing accurate insights for research using standard validated techniques. A researcher can evaluate their hypothesis on the basis of collected data. In most cases, data collection is the primary and most important step for research, irrespective of the field of research. The approach of data collection is different for different fields of study, depending on the required information.
The most critical objective of data collection is ensuring that information-rich and reliable data is collected for statistical analysis so that data-driven decisions can be made for research.
Quantitative Data Collection
What is the need for quantitative data collection?
In contrast to qualitative data, quantitative data is all about figures and numbers. Researchers often rely on quantitative data when they intend to quantify attributes, attitudes, behaviors, and other defined variables, with a motive to either support or oppose the hypothesis of a specific phenomenon by contextualizing the data obtained via surveying or interviewing the study sample. As a researcher, you have the option of collecting data online or using traditional data collection methods, depending on the research. However, you will need computational, statistical, and mathematical tools to derive results from the collected quantitative data.
Methods used for quantitative data collection

Data that can be counted or expressed numerically constitutes quantitative data. It is commonly used to study events or levels of occurrence and is collected through a structured questionnaire asking questions that start with “how much” or “how many.” Because quantitative data is numerical, it represents both definitive and objective information. Furthermore, quantitative information is well suited to statistical and mathematical analysis, making it possible to illustrate it in the form of charts and graphs.
Discrete and continuous are the two major categories of quantitative data: discrete data take finite, whole-number values, while continuous data fall on a continuum and can include fractions or decimals. If research is conducted to find out the number of vehicles owned by American households, the result is a whole number, which is an excellent example of discrete data. When research measures physical characteristics of the population such as height, weight, age, or distance, the result is an excellent example of continuous data. A small illustration of the two categories follows.
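The snippet below is a minimal Python sketch of the two categories; the variable names and values are made up for the example.

```python
# Hypothetical illustration of discrete vs. continuous quantitative data.

# Discrete data: counts that can only take whole-number values.
vehicles_per_household = [0, 1, 2, 2, 3, 1]   # number of vehicles owned

# Continuous data: measurements that fall on a continuum and may
# include fractions or decimals.
heights_cm = [162.5, 170.0, 158.3, 181.7]     # height in centimetres

print(all(isinstance(x, int) for x in vehicles_per_household))   # True
print(all(isinstance(x, float) for x in heights_cm))             # True
```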
Any traditional or online data collection method that helps in gathering numerical data is a proven method of collecting quantitative data.
1. Probability sampling
A definitive method of sampling carried out by utilizing some form of random selection, enabling researchers to make probability statements based on data collected at random from the targeted demographic. One of the best things about probability sampling is that it allows researchers to collect data from representatives of the population they are interested in studying. Besides, because the data is collected randomly from the selected sample, the possibility of sampling bias is ruled out.
There are three significant types of probability sampling (a minimal sampling sketch follows this list):
Simple random sampling: Every unit of the targeted demographic has an equal chance of being chosen for inclusion in the sample.
Systematic random sampling: Any unit of the targeted demographic could be included in the sample, but only the first unit is selected randomly; the rest are selected in an ordered fashion, as if choosing one out of every ten people on a list.
Stratified random sampling: Each unit is selected from a particular group (stratum) of the targeted audience while creating the sample. It is useful when the researchers are selective about including a specific set of people in the sample, e.g., only males or females, managers or executives, or people working within a particular industry.
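The following is a minimal Python sketch of the three sampling schemes using only the standard library; the sampling frame of 100 people and its gender attribute are invented for the example.

```python
import random

random.seed(42)  # for a reproducible illustration

# A hypothetical sampling frame of 100 people, half "male" and half "female".
population = [{"id": i, "gender": "male" if i % 2 == 0 else "female"}
              for i in range(100)]

# Simple random sampling: every unit has an equal chance of selection.
simple_sample = random.sample(population, k=10)

# Systematic random sampling: choose a random starting unit, then take
# every tenth unit from the ordered list.
step = 10
start = random.randrange(step)
systematic_sample = population[start::step]

# Stratified random sampling: draw a fixed number of units from each
# group (stratum) of interest, here males and females separately.
males = [p for p in population if p["gender"] == "male"]
females = [p for p in population if p["gender"] == "female"]
stratified_sample = random.sample(males, k=5) + random.sample(females, k=5)

print(len(simple_sample), len(systematic_sample), len(stratified_sample))  # 10 10 10
```

In practice, the choice among the three depends on whether a complete, ordered sampling frame exists and whether specific strata must be represented in the sample.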
2. Interviews
Interviewing people is a standard method used for data collection. However, interviews conducted to collect quantitative data are more structured, wherein the researcher asks only a standard set of questions and nothing more than that.
There are three major types of interviews conducted for data collection
Telephone interviews: For years, telephone interviews ruled the charts of data collection methods. However, nowadays, there is a significant rise in conducting video interviews using the internet, Skype, or similar online video calling platforms.
Face-to-face interviews: This is a proven technique for collecting data directly from participants. It helps in acquiring quality data, as it provides scope to ask detailed questions and to probe further in order to collect rich and informative data. Literacy requirements of the participant are irrelevant, as F2F interviews offer ample opportunities to collect non-verbal data through observation or to explore complex and unknown issues. Although it can be an expensive and time-consuming method, the response rates for F2F interviews are often higher.
Computer-Assisted Personal Interviewing (CAPI): This is a similar setup to the face-to-face interview, except that the interviewer carries a desktop or laptop at the time of the interview and uploads the data obtained directly into a database. CAPI saves a lot of time in updating and processing the data and also makes the entire process paperless, as the interviewer does not have to carry a bunch of papers and questionnaires.
3. Surveys/questionnaires
Surveys or questionnaires created using online survey software play a pivotal role in online data collection, be it quantitative or qualitative research. The surveys are designed in a manner that legitimizes the behavior and trust of the respondents. More often than not, checklist and rating-scale questions make up the bulk of quantitative surveys, as they help in simplifying and quantifying the attitude or behavior of the respondents.
There are two significant types of survey questionnaires used to collect online data for quantitative market research.
Web-based questionnaire: This is one of the most trusted methods for internet-based or online research. In a web-based questionnaire, the respondent receives an email containing the survey link; clicking on it takes the respondent to a secure online survey tool where he/she can fill in the survey questionnaire. Being cost-efficient, quicker, and wider in reach, web-based surveys are preferred by researchers. The primary benefit of a web-based questionnaire is flexibility; respondents are free to take the survey in their free time using a desktop, laptop, tablet, or mobile device.
Mail Questionnaire: In a mail questionnaire, the survey is mailed out to a host of the sample population, enabling the researcher to connect with a wide range of audiences. The mail questionnaire typically consists of a packet containing a cover sheet that introduces the audience to the type of research and the reason it is being conducted, along with a prepaid return envelope for collecting the responses. Although the mail questionnaire has a lower response rate compared to other quantitative data collection methods, adding certain perks such as reminders and incentives to complete the survey helps in drastically improving the response rate. One of the major benefits of the mail questionnaire is that all the responses are anonymous, and respondents are allowed to take as much time as they want to complete the survey and to be completely honest about their answers without fear of prejudice.
4. Observation
As the name suggests, this is a pretty simple and straightforward method of collecting quantitative data. In this method, researchers collect quantitative data through systematic observations, for example by counting the number of people present at a specific event at a particular time and venue, or the number of people attending an event in a designated place. More often than not, for quantitative data collection, researchers take a naturalistic observation approach, which requires keen observation skills and senses to capture numerical data about the “what” rather than the “why” and “how.”
Naturalistic observation is used to collect both types of data, qualitative and quantitative. However, structured observation is used more often to collect quantitative rather than qualitative data.
Structured observation: In this type of observation method, the researcher makes careful observations of one or more specific behaviors in a more comprehensive or structured setting compared to naturalistic or participant observation. Rather than observing everything, the researchers focus only on very specific behaviors of interest, which allows them to quantify the behaviors they are observing. When the observations require a judgment on the part of the observers, the process is often described as coding, which requires clearly defining a set of target behaviors. (A minimal coding sketch follows.)
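As a hypothetical sketch of how coded structured observations become quantitative counts, the snippet below tallies a predefined set of target behaviors; the behavior codes and the recorded session are invented for the example.

```python
from collections import Counter

# Hypothetical coding scheme: each observation is coded as one of a
# clearly defined set of target behaviors.
target_behaviors = {"raises_hand", "asks_question", "leaves_seat"}

# Codes recorded by an observer during one classroom session (made up).
observed_codes = ["raises_hand", "asks_question", "raises_hand",
                  "leaves_seat", "raises_hand", "asks_question"]

# Keep only the behaviors of interest and count how often each occurs.
counts = Counter(code for code in observed_codes if code in target_behaviors)
print(counts)  # Counter({'raises_hand': 3, 'asks_question': 2, 'leaves_seat': 1})
```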
5. Document Review
Document review is a process used to collect data by reviewing existing documents. It is an efficient and effective way of gathering data, as documents are manageable and a practical resource for obtaining qualified data from the past. Apart from strengthening and supporting the research by providing supplementary research data, document review has emerged as one of the most beneficial methods of gathering quantitative research data.
Three primary document types are analyzed for collecting supporting quantitative research data:
Public Records: Under this type of document review, official, ongoing records of an organization are analyzed for further research, for example annual reports, policy manuals, student activities, game activities in the university, etc.
Personal Documents: In contrast to public records, this type of document review deals with individual accounts of people’s actions, behavior, health, physique, etc., for example the height and weight of students or the distance students travel to attend school.
Physical Evidence: Physical evidence or physical documents deal with previous achievements of an individual or of an organization in terms of monetary and scalable growth.
Data Analysis
Data analysis is the process of collecting, modeling, and analyzing data to extract insights that support decision-making. There are several methods and techniques to perform analysis depending on the industry and the aim of the analysis. All these various methods for data analysis are largely based on two core areas: quantitative methods and qualitative methods in research.
Quantitative data is defined as the value of data in the form of counts or numbers, where each data set has a unique numerical value associated with it. This data is any quantifiable information that can be used for mathematical calculations and statistical analysis, such that real-life decisions can be made based on these mathematical derivations. Quantitative data is used to answer questions such as “How many?”, “How often?”, and “How much?”. This data can be verified and can also be conveniently evaluated using mathematical techniques. Any data expressed in numbers or numerical figures is called quantitative data. This type of data can be distinguished into categories, grouped, measured, calculated, or ranked. For example, variables such as age, rank, cost, length, weight, and scores all fall under this type of data.
Quantitative Data : Analysis Methods
Data collection forms a major part of the research process. This data, however, has to be analyzed to make sense of it. There are multiple methods of analyzing quantitative data collected in surveys. They are:
Cross-tabulation: Cross-tabulation is one of the most widely used quantitative data analysis methods. It is a preferred method because it uses a basic tabular form to draw inferences between different data sets in the research study. It contains data that is mutually exclusive or has some connection with each other. (A minimal cross-tabulation sketch follows this list.)
Trend analysis: Trend analysis is a statistical analysis method that provides the ability to look at quantitative data collected over a long period of time. This data analysis method helps track how the data changes over time and aims to understand the change in one variable while another variable remains unchanged.
MaxDiff analysis: The MaxDiff analysis is a quantitative data analysis method that is used to gauge customer preferences for a purchase and what parameters rank higher than the others in this process. In a simplistic form, this method is also called the “best-worst” method. This method is very similar to conjoint analysis but is much easier to implement and can be interchangeably used.
Conjoint analysis: Like the method above, conjoint analysis is a quantitative data analysis method that analyzes the parameters behind a purchasing decision. This method possesses the ability to collect and analyze advanced metrics, which provide an in-depth insight into purchasing decisions as well as the parameters that rank as the most important.
TURF analysis: TURF analysis, or Total Unduplicated Reach and Frequency analysis, is a quantitative data analysis methodology that assesses the total market reach of a product or service, or a mix of both. This method is used by organizations to understand the frequency and the avenues at which their messaging reaches customers and prospective customers, which helps them tweak their go-to-market strategies.
Gap analysis: Gap analysis uses a side-by-side matrix to depict quantitative data that helps measure the difference between expected performance and actual performance. This data analysis helps measure gaps in performance and the things that are required to be done to bridge this gap.
SWOT analysis: SWOT analysis is a quantitative data analysis method that assigns numerical values to indicate the strengths, weaknesses, opportunities, and threats of an organization, product, or service, which in turn provides a holistic picture of the competition. This method helps to create effective business strategies.
Text analysis: Text analysis is an advanced statistical method in which intelligent tools make sense of qualitative, open-ended data and quantify or fashion it into easily understandable data. This method is used when the raw survey data is unstructured but has to be brought into a structure that makes sense.
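As a minimal sketch of the cross-tabulation mentioned above, the snippet below builds a contingency table from made-up survey responses, assuming the pandas library is available.

```python
import pandas as pd

# Made-up survey responses for illustration only.
responses = pd.DataFrame({
    "gender": ["male", "female", "female", "male", "female", "male"],
    "satisfaction": ["high", "high", "low", "low", "high", "high"],
})

# Cross-tabulation: a contingency table relating two categorical variables.
table = pd.crosstab(responses["gender"], responses["satisfaction"])
print(table)
```

The resulting table can then be charted or tested for association, depending on the research question.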
Steps To Conduct Quantitative Data Analysis
For quantitative data, raw information has to be presented in a meaningful manner using analysis methods. Quantitative data should be analyzed in order to find evidential data that helps in the research process.
Relate measurement scales with variables: Associate measurement scales such as nominal, ordinal, interval, and ratio with the variables. This step is important for arranging the data in the proper order. Data can be entered into an Excel sheet to organize it in a specific format.
Connect descriptive statistics with data: Link descriptive statistics to summarize the available data, since it can be difficult to establish a pattern in raw data. Some widely used descriptive statistics are listed below (a worked sketch follows these steps):
Mean- An average of values for a specific variable
Median- A midpoint of the value scale for a variable
Mode- For a variable, the most common value
Frequency- Number of times a particular value is observed in the scale
Minimum and Maximum Values- Lowest and highest values for a scale
Percentages- Format to express scores and set of values for variables
Decide on a measurement scale: It is important to decide on the measurement scale before drawing descriptive statistics for a variable. For instance, a nominal variable will never have a mean or median, so the descriptive statistics vary correspondingly. Descriptive statistics suffice in situations where the results do not need to be generalized to the population.
Select appropriate tables to represent data and analyze collected data: After deciding on a suitable measurement scale, researchers can use a tabular format to represent data. This data can be analyzed using various techniques such as Cross-tabulation or TURF.
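The sketch below walks through these steps on a small, made-up data set, assuming the pandas library: the data is organized in tabular form, descriptive statistics are computed for a ratio-scale variable, and a nominal variable is summarized with frequencies and percentages instead of a mean.

```python
import pandas as pd

# Steps 1-2: organize the raw data in tabular form (values are made up).
data = pd.DataFrame({
    "age": [21, 25, 30, 21, 40, 35],           # ratio scale
    "rank": [1, 2, 3, 2, 1, 3],                # ordinal scale
    "gender": ["f", "m", "f", "m", "f", "m"],  # nominal scale
})

# Step 3: descriptive statistics for a numeric (ratio-scale) variable.
print("mean:", data["age"].mean())
print("median:", data["age"].median())
print("mode:", data["age"].mode().tolist())
print("min/max:", data["age"].min(), data["age"].max())
print("frequencies:\n", data["age"].value_counts())

# Steps 4-5: a nominal variable has no meaningful mean or median, so
# summarize it with counts and percentages instead.
print(data["gender"].value_counts())
print(data["gender"].value_counts(normalize=True) * 100)  # percentages
```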
What Is Data Interpretation?
Data interpretation refers to the implementation of processes through which data is reviewed for the purpose of arriving at an informed conclusion. The interpretation of data assigns a meaning to the information analyzed and determines its significance and implications.
The importance of data interpretation is evident and this is why it needs to be done properly. Data is very likely to arrive from multiple sources and has a tendency to enter the analysis process with haphazard ordering.
How To Interpret Data?
When interpreting data, an analyst must try to discern the differences between correlation, causation, and coincidence, as well as many other biases, and must also consider all the factors that may have led to a result. There are various data interpretation methods one can use.
The interpretation of data is designed to help people make sense of numerical data that has been collected, analyzed, and presented. Having a baseline method (or methods) for interpreting data will provide your analyst teams with a structure and a consistent foundation. Indeed, if several departments have different approaches to interpreting the same data while sharing the same goals, mismatched objectives can result. Disparate methods lead to duplicated efforts, inconsistent solutions, wasted energy, and inevitably wasted time and money.
Quantitative Data Interpretation
If quantitative data interpretation could be summed up in one word (and it really can’t), that word would be “numerical.” There are few certainties when it comes to data analysis, but you can be sure that if the research you are engaging in has no numbers involved, it is not quantitative research. Quantitative analysis refers to a set of processes by which numerical data is analyzed. More often than not, it involves the use of statistical measures such as the standard deviation, mean, and median. Let’s quickly review the most common statistical terms (a minimal sketch follows this list):
Mean: a mean represents a numerical average for a set of responses. When dealing with a data set (or multiple data sets), a mean will represent a central value of a specific set of numbers. It is the sum of the values divided by the number of values within the data set. Other terms that can be used to describe the concept are arithmetic mean, average and mathematical expectation.
Standard deviation: this is another statistical term commonly appearing in quantitative analysis. Standard deviation reveals the distribution of the responses around the mean. It describes the degree of consistency within the responses; together with the mean, it provides insight into data sets.
Frequency distribution: this is a measurement gauging the rate at which a response appears within a data set. When using a survey, for example, frequency distribution can determine the number of times a specific ordinal-scale response appears (e.g., agree, strongly agree, disagree). Frequency distribution is extremely useful in determining the degree of consensus among data points.
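A minimal sketch of these three terms, using Python’s standard statistics module with made-up test scores and ordinal survey responses:

```python
import statistics
from collections import Counter

# Made-up test scores and ordinal responses for illustration only.
scores = [68, 72, 75, 75, 80, 90]
responses = ["agree", "agree", "strongly agree", "disagree", "agree"]

print("mean:", statistics.mean(scores))        # central value of the scores
print("std dev:", statistics.stdev(scores))    # spread of scores around the mean
print("frequency distribution:", Counter(responses))
```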
Typically, quantitative data is measured by visually presenting correlation tests between two or more variables of significance. Different processes can be used together or separately, and comparisons can be made to ultimately arrive at a conclusion. Other signature interpretation processes of quantitative data include:
- Regression analysis (a minimal sketch follows this list)
- Cohort analysis
- Predictive and prescriptive analysis
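As a brief, hypothetical illustration of regression analysis, the snippet below fits a straight line by least squares to made-up advertising-spend and sales figures, assuming the numpy library is available.

```python
import numpy as np

# Made-up data: advertising spend (x) and sales (y) for six periods.
spend = np.array([1.0, 2.0, 3.0, 4.0, 5.0, 6.0])
sales = np.array([2.1, 4.3, 6.2, 8.1, 9.8, 12.2])

# Simple linear regression via least squares: sales ~ slope * spend + intercept.
slope, intercept = np.polyfit(spend, sales, deg=1)
print(f"estimated relationship: sales = {slope:.2f} * spend + {intercept:.2f}")
```

np.polyfit is used here only to keep the sketch dependency-light; dedicated statistics packages report additional diagnostics such as confidence intervals.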
Conclusion
Quantitative data is about convergent reasoning rather than divergent thinking. It deals with numbers, logic, and an objective stance, focusing on numeric and unchanging data. More often than not, structured data collection methods are used to gather quantitative research data, and the results depend on larger sample sizes that are representative of the population the researcher intends to study.
Although there are many other methods to collect quantitative data, those mentioned above (probability sampling, interviews, surveys/questionnaires, observation, and document review) are the most common and widely used, whether offline or for online data collection.
Quantitative research is comprehensive and is perhaps the only data type that can display analytic results in charts and graphs. Quality data will give you precise results, and data analysis is probably the most essential component: weak data will not only hamper the integrity and authenticity of your research but will also make the findings unstable. Therefore, no matter which method you choose to collect quantitative data, ensure that the data collected is of good quality and able to provide insightful and actionable results.