Talend is a modern data management platform that offers a comprehensive suite of capabilities to handle data and drive business value efficiently.
It works with any data environment or framework and supports cloud, multi-cloud, hybrid, or on-premises environments.
Talend offers a modular solution that accommodates data needs at any scale or complexity. Its key features include data integration, data quality, governance, and application and API integration. Talend is flexible, scalable, and cloud-independent, supporting a variety of architectures.
With Talend, organizations can turn their data into a trusted asset and tackle key initiatives for strategic business outcomes.
In this tutorial, we will walk through the process of reading data from an Excel file with Talend, covering the key components that make effective logging and monitoring possible along the way.
Talend provides Input components for reading data; for an Excel source, this is tFileInputExcel. This component lets us point to the file, select the sheets, and map the columns.
It supports a range of Excel formats, giving you flexibility in working with different data structures.
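The schema you define on tFileInputExcel corresponds roughly to a plain row class in the Java code Talend generates for the job. As a hedged illustration (the column names id, name, and amount are invented for this example, not taken from the tutorial's file):

```java
// Illustrative sketch only: Talend generates one row class per schema.
// The columns here (id, name, amount) are example names, not the
// actual schema from the tutorial's Excel file.
public class Row1 {
    public int id;
    public String name;
    public double amount;

    public Row1(int id, String name, double amount) {
        this.id = id;
        this.name = name;
        this.amount = amount;
    }

    @Override
    public String toString() {
        // Mimics tLogRow's default "|"-separated output
        return id + "|" + name + "|" + amount;
    }
}
```

Defining the schema up front is what lets downstream components such as tMap and tLogRow know the names and types of the columns they receive.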
→ In the Repository, open Metadata and create the connection the job needs (in our case the source is an Excel file, so we use File Excel for the connection)
→ In the Repository, find Job Designs and create a new job with the desired name
→ Create the tFileInputExcel component
→ Double-click on tFileInputExcel to open its Component view
→ Set the file path and sheet name, and configure the other options to match your Excel file's structure
→ Define the schema by clicking on the “Edit Schema” button. This step is crucial to define the structure of the data that will be read
→ Create the tMap component
→ Connect the output of tFileInputExcel to the input of tMap
→ Double-click on tMap to open its Mapping Editor
→ In the Mapping Editor, you’ll see input and output tables
→ Map the columns from the input (tFileInputExcel) to the desired columns in the output
→ Apply any transformations or calculations if needed
→ Define the output schema by clicking on the “Edit Schema” button
→ Create the tLogRow component
→ Connect the output of tMap to the input of tLogRow
→ Configure tLogRow
→ Double-click on tLogRow to open its Component view
→ Select the desired schema and choose the display mode (Basic, Table, or Vertical) that suits how you want to read the data
→ Save your job
→ Click on the Run button to execute the job
→ Open the console or Run view to see the output logged by tLogRow
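Under the hood, Talend generates Java code for each job. The steps above can be sketched roughly as follows. This is not the actual generated code: real Excel parsing would use a library such as Apache POI, so the input rows are replaced here by an in-memory sample, and the column names are invented for illustration:

```java
import java.util.ArrayList;
import java.util.List;

public class ExcelJobSketch {

    // Stand-in for tFileInputExcel: a real job would read rows from
    // the configured file and sheet; here we use sample data.
    static List<String[]> readInput() {
        List<String[]> rows = new ArrayList<>();
        rows.add(new String[]{"1", "alice"});
        rows.add(new String[]{"2", "bob"});
        return rows;
    }

    // Stand-in for tMap: map input columns to output columns and
    // apply a simple transformation (upper-casing the second column).
    static List<String[]> mapRows(List<String[]> input) {
        List<String[]> out = new ArrayList<>();
        for (String[] row : input) {
            out.add(new String[]{row[0], row[1].toUpperCase()});
        }
        return out;
    }

    // Stand-in for tLogRow: print each row to the console,
    // "|"-separated as in tLogRow's Basic mode.
    static void logRows(List<String[]> rows) {
        for (String[] row : rows) {
            System.out.println(String.join("|", row));
        }
    }

    public static void main(String[] args) {
        logRows(mapRows(readInput()));
    }
}
```

The three methods mirror the three connected components: the output of each stage feeds the input of the next, which is exactly what the row connections between tFileInputExcel, tMap, and tLogRow express in the job design.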
Here is the result in Talend:
Watch the video tutorial here:
About btProvider:
We hold technical and sales certifications for the following products:
Talend, Tableau Desktop, Tableau Public, Tableau Server, Tableau Prep, Tableau Data Management, Vertica, Salesforce, Mulesoft, Write-Back.
With Talend, you can understand your data better: make intelligent, strategic decisions and seamlessly integrate, ensure the quality of, and govern your data. See other #skillpill video tutorials here:
Using measure names in parameter actions
Are you interested in learning more about Talend – the catalyst for efficient and comprehensive data management solutions across your company?