List of Pages in Category ETL (85 pages)
extract, transform, load (ETL)
The process of extracting data from source systems, transforming it to fit operational needs, and loading it into an end target, typically a database or data warehouse.
C
- Cloudera Manager and Managed Service Datastores
- Cloudera Search Tasks and Processes
- COMPUTE STATS Statement
- Configuring an External Database for Sqoop
- Configuring Encrypted On-disk File Channels for Flume
- Configuring Flume Solr Sink to Sip from the Twitter Firehose
- Configuring Flume's Security Properties
- Configuring Kerberos for Flume Thrift Source and Sink Using Cloudera Manager
- Configuring Kerberos for Flume Thrift Source and Sink Using the Command Line
- Configuring Sqoop 2
- Configuring the Flume Solr Sink
- Configuring TLS/SSL for Flume Thrift Source and Sink
- Configuring TLS/SSL for Solr
- Copying Data between a Secure and an Insecure Cluster using DistCp and WebHDFS
- Copying Data Between Two Clusters Using DistCp
E
- Enabling Hue Applications Using Cloudera Manager
- Extracting, Transforming, and Loading Data With Cloudera Morphlines
F
- Feature Differences - Sqoop 1 and Sqoop 2
- Files Installed by the Flume RPM and Debian Packages
- Flume Account Requirements
- Flume Authentication
- Flume Configuration
- Flume Installation
- Flume Morphline Interceptor Configuration Options
- Flume Morphline Solr Sink Configuration Options
- Flume Near Real-Time Indexing Reference
- Flume Packaging
- Flume Solr BlobDeserializer Configuration Options
- Flume Solr BlobHandler Configuration Options
- Flume Solr UUIDInterceptor Configuration Options
I
- Importing Data Into HBase
- Indexing a File Containing Tweets with Flume HTTPSource
- Indexing a File Containing Tweets with Flume SpoolDirectorySource
- INSERT Statement
- Installing Sqoop 2
- Installing the Flume RPM or Debian Packages
- Installing the Flume Tarball
- Installing the JDBC Drivers for Sqoop 1
- Installing the Sqoop 1 RPM or Debian Packages
- Installing the Sqoop 1 Tarball
- Introducing Cloudera Navigator Optimizer
- INVALIDATE METADATA Statement
S
- Snappy Compression
- Sqoop 1 Installation
- Sqoop 1 Packaging
- Sqoop 1 Prerequisites
- Sqoop 2 Installation
- Sqoop Authentication
- Sqoop, Pig, and Whirr Security Support Status
- Starting the Flume Agent
- Starting, Stopping, and Accessing the Sqoop 2 Server
- Step 10: (Flume Only) Use Substitution Variables for the Kerberos Principal and Keytab
- Supported Sources, Sinks, and Channels
U
- Upgrading Flume
- Upgrading Sqoop 1 from an Earlier CDH 5 Release
- Upgrading Sqoop 1 from CDH 4 to CDH 5
- Upgrading Sqoop 2 from an Earlier CDH 5 Release
- Upgrading Sqoop 2 from CDH 4 to CDH 5
- Using HDFS Caching with Impala (CDH 5.1 or higher only)
- Using Impala to Query HBase Tables
- Using Impala with the Amazon S3 Filesystem
- Using Text Data Files with Impala Tables
- Using the Avro File Format with Impala Tables
- Using the Parquet File Format with Impala Tables
- Using the RCFile File Format with Impala Tables
- Using the SequenceFile File Format with Impala Tables