
Study Flashcards

Which Big Data approach promotes efficiency, lower cost, and better performance by processing jobs in a shared, centrally managed pool of IT resources?


A) in-memory analytics
B) in-database analytics
C) grid computing
D) appliances

Correct Answer: C) grid computing

How does Hadoop work?


A) It integrates Big Data into a whole so large data elements can be processed as a whole on one computer.
B) It integrates Big Data into a whole so large data elements can be processed as a whole on multiple computers.
C) It breaks up Big Data into multiple parts so each part can be processed and analyzed at the same time on one computer.
D) It breaks up Big Data into multiple parts so each part can be processed and analyzed at the same time on multiple computers.

Correct Answer: D) It breaks up Big Data into multiple parts so each part can be processed and analyzed at the same time on multiple computers.
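
To make the idea in answer D concrete, here is a minimal Python sketch (plain Python, not Hadoop itself; the data and the four-way chunking are made up for illustration) of splitting a dataset into parts and analyzing the parts at the same time:

```python
# Minimal illustration of Hadoop's divide-and-conquer idea in plain
# Python: split the data into parts, process each part in parallel,
# then combine the partial results.
from multiprocessing import Pool

def count_words(chunk):
    # Analyze one part of the data independently.
    return sum(len(line.split()) for line in chunk)

if __name__ == "__main__":
    data = [f"record {i} with some text" for i in range(1_000)]
    chunks = [data[i::4] for i in range(4)]  # four parts

    # Each part is processed at the same time; in a real Hadoop
    # cluster the parts would live on multiple computers.
    with Pool(4) as pool:
        partial = pool.map(count_words, chunks)

    print(sum(partial))  # combine the partial results
```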

Allowing Big Data to be processed in memory and distributed across a dedicated set of nodes can solve complex problems in near-real time with highly accurate insights. What is this process called?


A) in-memory analytics
B) in-database analytics
C) grid computing
D) appliances

Correct Answer: A) in-memory analytics

In the Big Data and Analytics in Politics case study, what was the analytic system output or goal?


A) census data
B) assessment of sentiment
C) voter mobilization
D) group clustering

Correct Answer: C) voter mobilization

All of the following statements about MapReduce are true EXCEPT


A) MapReduce is a general-purpose execution engine.
B) MapReduce handles the complexities of network communication.
C) MapReduce handles parallel programming.
D) MapReduce runs without fault tolerance.

Correct Answer: D) MapReduce runs without fault tolerance (this is the false statement; MapReduce does provide fault tolerance).
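
For readers who want to see the model behind these statements, below is a simplified word-count sketch of MapReduce in plain Python (an illustration of the programming model only, not the Hadoop engine, which additionally handles network communication, parallelism, and fault tolerance):

```python
# Simplified MapReduce word count: map emits (key, value) pairs,
# a shuffle step groups values by key, and reduce aggregates them.
from collections import defaultdict

def map_phase(line):
    for word in line.split():
        yield word, 1

def reduce_phase(word, values):
    return word, sum(values)

lines = ["big data on big clusters", "big insights"]

grouped = defaultdict(list)          # shuffle: group by key
for line in lines:
    for word, one in map_phase(line):
        grouped[word].append(one)

result = dict(reduce_phase(w, v) for w, v in grouped.items())
print(result)  # {'big': 3, 'data': 1, 'on': 1, 'clusters': 1, 'insights': 1}
```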

A job ________ is a node in a Hadoop cluster that initiates and coordinates MapReduce jobs, or the processing of the data.

Correct Answer: tracker

Big Data uses commodity hardware, which is expensive, specialized hardware that is custom built for a client or application.

Correct Answer: False

In the opening vignette, what is the source of the Big Data collected at the European Organization for Nuclear Research or CERN?

Correct Answer:

Forty million times per second, particles...


In the Dublin City Council case study, GPS data from the city's buses and CCTV were the only data sources for the Big Data GIS-based application.

Correct Answer: False

In a Hadoop "stack," what is a slave node?


A) a node where bits of programs are stored
B) a node where metadata is stored and used to organize data processing
C) a node where data is stored and processed
D) a node responsible for holding all the source programs

Correct Answer: C) a node where data is stored and processed

Hadoop is primarily a(n) ________ file system and lacks capabilities we'd associate with a DBMS, such as indexing, random access to data, and support for SQL.

Correct Answer: distributed

Despite their potential, many current NoSQL tools lack mature management and monitoring tools.

Correct Answer: True

In most cases, Hadoop is used to replace data warehouses.

Correct Answer: False

When considering Big Data projects and architecture, list and describe five challenges designers should be mindful of in order to make the journey to analytics competency less stressful.

Correct Answer:

Data volume: The ability to capt...


A newly popular unit of data in the Big Data era is the petabyte (PB), which is


A) 10⁹ bytes.
B) 10¹² bytes.
C) 10¹⁵ bytes.
D) 10¹⁸ bytes.

Correct Answer: C) 10¹⁵ bytes.
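
As a quick arithmetic check on answer C, the decimal byte-unit ladder can be written out directly (a throwaway Python sketch):

```python
# Decimal (SI) byte units: each step up is a factor of 10**3.
units = {"KB": 10**3, "MB": 10**6, "GB": 10**9,
         "TB": 10**12, "PB": 10**15, "EB": 10**18}

# A petabyte is 10**15 bytes, i.e. a thousand terabytes.
assert units["PB"] == 1_000 * units["TB"] == 10**15
print(f"1 PB = {units['PB']:,} bytes")
```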

In the Discovery Health insurance case study, the analytics application used available data to help the company do all of the following EXCEPT


A) predict customer health.
B) detect fraud.
C) lower costs for members.
D) open its own pharmacy.

Correct Answer: D) open its own pharmacy.

There is a current undersupply of data scientists for the Big Data market.

Correct Answer: True

Why are some portions of tape backup workloads being redirected to Hadoop clusters today?

Correct Answer:

First, while it may appear inexpensive t...


Traditional data warehouses have not been able to keep up with


A) the evolution of the SQL language.
B) the variety and complexity of data.
C) expert systems that run on them.
D) OLAP.

Correct Answer: B) the variety and complexity of data.

As the size and the complexity of analytical systems increase, the need for more ________ analytical systems is also increasing to obtain the best performance.

Correct Answer: efficient
