friendsliner.blogg.se

Airflow dag logging

This DAG consists of two tasks and one task group:

The load_file task uses the Astro Python SDK load file operator to load the contents of the local CSV file into the data warehouse. The table named 'energy' will be created by this task if it does not exist yet. The dataset mostly contains data from EU countries.

The transform_data task group is created from the dbt models. The DbtTaskGroup function of the Astro dbt provider package automatically scans the dbt folder for dbt projects and creates a task group (transform_data in this example) containing Airflow tasks for running and testing your dbt models. Using the models defined in Step 4, the task group will contain two nested task groups with two tasks each, one for dbt run, the other for dbt test. Additionally, the provider can infer dependencies within the dbt project and will set your Airflow task dependencies accordingly.

The log_data_analysis task uses the Astro Python SDK dataframe operator to run an analysis on the final table using pandas and logs the results.

(Optional) Choose which country's data to analyze by specifying your desired country_code in the dbt_args parameter of the DbtTaskGroup. See the dataset for all available country codes.
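The dependency inference mentioned above boils down to ordering tasks so that each dbt model runs only after every model it references. A minimal standard-library sketch of that idea follows; the model names and the ref map are illustrative assumptions, not taken from the tutorial's project:

```python
from graphlib import TopologicalSorter

# Hypothetical dbt project: each model maps to the models it ref()'s.
# The Astro dbt provider derives a graph like this by parsing the dbt
# project, then sets the Airflow task dependencies to match.
model_refs = {
    "select_country": [],              # reads the raw source table
    "create_pct": ["select_country"],  # builds on select_country
}

# static_order() yields models so that dependencies always come first
order = list(TopologicalSorter(model_refs).static_order())
print(order)
```

Because the ordering is derived from the graph rather than hard-coded, adding a new model with its refs is enough for the corresponding run/test tasks to land in the right place.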


"dbt_executable_path" : DBT_EXECUTABLE_PATH , # use the DbtTaskGroup class to create a task group containing task created # the table named 'energy' will be created by this task if it does not exist yet # Astro SDK task that loads information from the local csv into a relational database If latest_year = year_with_the_highest_solar_pct :į"Yay! In adoption of renewable energy is ( sort_values ( by = "YEAR", ascending = True ) Year_with_the_highest_renewables_pct = df. If the latest year in the data was also the year with the highest % of solarĬapacity and/or the year with the highest % of renewables capacity aĬelebratory message is logged as weel.""" To log a table of % Solar and % renewable energy capacity per year. """Analyzes the energy capacity information from the input table in order # in the virtual environment created in the DockerfileĭBT_EXECUTABLE_PATH = "/usr/local/airflow/dbt_venv/bin/dbt"ĭef log_data_analysis ( df : pd. # the path where the Astro dbt provider will find the dbt executable task_group import DbtTaskGroupįrom astro.














