Introduction to IoT – Chapter 3 Quiz Answers

1. True or False?
Data is defined as big data if it is more than 1 petabyte.

  • true
  • false

Explanation: Refer to curriculum topic: 3.1.1
False. There is no minimum size threshold for data to be considered big data.

2. What type of data is being collected when an organization is using spreadsheets and forms for data input?

  • structured data
  • application data
  • unstructured data
  • raw data

Explanation: Refer to curriculum topic: 3.1.3
Collected data is categorized as either structured or unstructured data. Structured data is created by applications that use ‘fixed’ format input such as spreadsheets. Unstructured data is generated in ‘freeform’ style such as tweets or audio files.

3. True or False?
Structured data is easier to analyze and store than unstructured data.

  • true
  • false

Explanation: Refer to curriculum topic: 3.1.2
Unstructured data, in contrast to structured data, is generated in a freeform style and is more difficult to store and analyze.

4. A business is analyzing sales information as the sales data is being generated from the point-of-sale (POS) machines. What type of data is captured and processed as the events happen?

  • saved data
  • raw data
  • transactional data
  • analytical data

Explanation: Refer to curriculum topic: 3.1.3
There are two primary types of processed data: transactional and analytical. Transactional information is captured and processed as events happen. Analytical information supports managerial tasks such as numerical analysis and decision making.
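To make the distinction concrete, here is a minimal Python sketch: each POS sale is captured as a transactional record when it happens, and the per-store totals derived from those records are the kind of analytical information used for decision making. The sample events and field names are hypothetical.

```python
# Minimal sketch contrasting transactional and analytical data.
from collections import defaultdict

# Transactional data: one record per sale, captured as events occur.
events = [
    {"store": "A", "item": "coffee", "amount": 3.50},
    {"store": "A", "item": "bagel", "amount": 2.25},
    {"store": "B", "item": "coffee", "amount": 3.50},
]

# Analytical data: totals per store derived from the transactions,
# the kind of summary used for decisions such as stocking inventory.
totals = defaultdict(float)
for e in events:
    totals[e["store"]] += e["amount"]

print(dict(totals))  # {'A': 5.75, 'B': 3.5}
```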

5. True or False?
Web scraping tools are used to extract and manipulate structured data.

  • false
  • true

Explanation: Refer to curriculum topic: 3.1.3
Collected data can be categorized as structured or unstructured. Both categories of data can be collected from different file formats that may not necessarily be compatible with one another. Structured data is often stored and manipulated in fixed formats such as comma-separated values (CSV), whereas web scraping tools are used to extract unstructured data.
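As a small illustration of why structured data is easier to work with, the Python sketch below reads a CSV file with the standard csv module; because the columns are in a fixed format, they can be queried directly. The file name sales.csv and its column names are hypothetical.

```python
# Minimal sketch: reading structured (fixed-format) data with Python's
# built-in csv module.
import csv

with open("sales.csv", newline="") as f:
    reader = csv.DictReader(f)  # each row maps column name -> value
    for row in reader:
        # Fixed columns make structured data easy to query directly.
        print(row["product"], row["quantity"])
```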

6. What are three features of Hadoop? (Choose three.)

  • requires proprietary software
  • uses HDFS as a fault tolerant file system
  • must run on a single virtual machine
  • easily scalable cluster sizes
  • automatic replication of data across clusters

Explanation: Refer to curriculum topic: 3.1.2
Hadoop is an open-source distributed storage solution for big data management. It easily scales cluster sizes, provides automatic replication of data across clusters, and uses HDFS as a fault-tolerant file system.
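As an illustration of the replication feature, the sketch below invokes the standard hdfs dfs -setrep command from Python to set a file's replication factor to 3. It assumes a running Hadoop cluster with the hdfs client on the PATH, and the HDFS path is hypothetical.

```python
# Minimal sketch: setting the HDFS replication factor for a file by
# invoking the standard "hdfs dfs -setrep" command from Python.
import subprocess

subprocess.run(
    ["hdfs", "dfs", "-setrep", "-w", "3", "/user/data/sales.csv"],
    check=True,  # raise an exception if the command fails
)
```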

7. What characterizes data management problems associated with big data storage?

  • ensuring that data is accessible from anywhere at any time
  • generating and collecting data from multiple sources
  • maintaining the integrity of stored data
  • making data only available to authorized users

Explanation: Refer to curriculum topic: 3.1.2
Because data can be generated and collected from many different sources, a management system must be used to organize and collate all of them.

8. Which type of data can be used by a business to support managerial analysis tasks and organizational decision making?

  • transactional data
  • analyzed data
  • saved data
  • raw data

Explanation: Refer to curriculum topic: 3.1.3
There are two primary types of processed data: transactional and analytical. Transactional information is captured and processed as events happen; for example, it is used to analyze daily sales reports and production schedules to determine how much inventory to carry. Analytical information supports managerial analysis tasks such as determining whether the organization should build a new manufacturing plant or hire additional sales personnel.

9. An organization is concerned with the amount of sensor data that is being generated locally, analyzed in the cloud, and returned for processing at the local site. Which solution will keep the data closer to the source for preprocessing?

  • distributed processing
  • cloud computing
  • fog computing
  • data mining

Explanation: Refer to curriculum topic: 3.1.2
Fog computing is designed to preprocess data close to the source of the data at end-user or edge devices.
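For illustration, here is a minimal Python sketch of the fog idea: raw sensor readings are averaged at the edge so that only small summaries cross the network. The function send_to_cloud is a hypothetical stand-in for a real uplink, and the readings are sample data.

```python
# Minimal sketch of fog-style preprocessing: sensor readings are reduced
# at the edge so only a small summary is sent to the cloud.
def send_to_cloud(summary):
    print("uploading:", summary)  # hypothetical uplink

def preprocess_at_edge(readings, window=5):
    # Reduce each window of raw readings to a single average.
    for i in range(0, len(readings), window):
        window_vals = readings[i:i + window]
        send_to_cloud(sum(window_vals) / len(window_vals))

preprocess_at_edge([21.0, 21.2, 20.9, 21.1, 21.0,
                    23.5, 23.6, 23.4, 23.7, 23.5])
```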

10. Which attribute of big data involves an exponential data growth rate?

  • variety
  • value
  • volume
  • velocity

Explanation: Refer to curriculum topic: 3.1.1
The characteristic of velocity refers to the rate at which data grows, which is exponential.

11. Which challenge of big data storage is characterized by the need to make data accessible from anywhere at any time?

  • analytics
  • management
  • access
  • security

Explanation: Refer to curriculum topic: 3.1.2
Access refers to the challenge of making big data accessible from anywhere at any time.

12. What is the process of discovering patterns and relationships in large data sets in order to convert raw data into meaningful information?

  • data selection
  • data querying
  • data mining
  • data formatting

Explanation: Refer to curriculum topic: 3.1.3
Data mining is the process of turning raw data into meaningful information by discovering patterns and relationships in large data sets. To be of value, the mined data is analyzed and presented to decision makers.
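One simple data-mining technique can be sketched in a few lines of Python: counting which item pairs occur together across transactions to surface "frequently bought together" patterns. The sample transactions are hypothetical.

```python
# Minimal sketch: discovering co-occurrence patterns in transaction data.
from collections import Counter
from itertools import combinations

transactions = [
    {"milk", "bread", "eggs"},
    {"milk", "bread"},
    {"bread", "eggs"},
    {"milk", "eggs"},
]

pair_counts = Counter()
for basket in transactions:
    # Count every unordered pair of items in the basket.
    for pair in combinations(sorted(basket), 2):
        pair_counts[pair] += 1

print(pair_counts.most_common(3))  # the most frequent item pairs
```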

13. What is cloud computing?

  • a comprehensive ecosystem of open-source software for big data management
  • an architecture that utilizes edge devices for data preprocessing and storage
  • a system of data centers or connected servers that provide anywhere/anytime access to data and applications
  • a process that turns raw data into meaningful information using patterns and relationships in data sets

Explanation: Refer to curriculum topic: 3.1.2
Cloud computing uses data centers and groups of connected servers to provide users with access to data and applications anywhere, anytime, and on any device.

14. True or False?
Distributed data processing involves large databases being centrally processed by powerful mainframe computers and stored in giant disk arrays.

  • true
  • false

Explanation: Refer to curriculum topic: 3.1.2
Distributed data processing involves breaking large data volumes into smaller pieces that are distributed to many computers for processing.
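The same idea can be sketched on a single machine with Python's multiprocessing module: the data set is split into chunks, worker processes handle the chunks in parallel, and the partial results are combined afterward. Real distributed systems spread the chunks across many computers, but the pattern is the same.

```python
# Minimal sketch of distributed-style processing: split, process in
# parallel, then combine the partial results.
from multiprocessing import Pool

def process_chunk(chunk):
    # Stand-in for real per-chunk work (here, just summing values).
    return sum(chunk)

if __name__ == "__main__":
    data = list(range(1_000_000))
    chunk_size = 100_000
    chunks = [data[i:i + chunk_size]
              for i in range(0, len(data), chunk_size)]

    with Pool() as pool:
        partials = pool.map(process_chunk, chunks)  # one chunk per worker

    print(sum(partials))  # combine the partial results
```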

15. Making data accessible to only authorized users addresses which data storage problem associated with big data?

  • redundancy
  • security
  • management
  • access

Explanation: Refer to curriculum topic: 3.1.2
Stored data must be kept secure and accessible only to authorized users; otherwise, it potentially loses its value.
