Google Cloud Composer Python Samples

This directory contains samples for Google Cloud Composer. Google Cloud Composer is a managed Apache Airflow service that helps you create, schedule, monitor, and manage workflows. Cloud Composer's automation helps you create Airflow environments quickly and use Airflow-native tools, such as the powerful Airflow web interface and command-line tools, so you can focus on your workflows rather than your infrastructure.

Setup

Authentication

These samples require you to have set up authentication. Refer to the Authentication Getting Started Guide for instructions on setting up credentials for applications.
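
As an optional sanity check (not part of the samples themselves), the following sketch verifies that Application Default Credentials can be found; it assumes the google-auth package, which the samples depend on:

    import google.auth
    from google.auth.exceptions import DefaultCredentialsError

    try:
        # Picks up credentials from GOOGLE_APPLICATION_CREDENTIALS,
        # gcloud auth application-default login, or the metadata server.
        credentials, project_id = google.auth.default()
        print('Found credentials for project: {}'.format(project_id))
    except DefaultCredentialsError:
        print('No credentials found; see the guide above.')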

Install Dependencies

  1. Clone python-docs-samples and change into the directory for this sample.

    $ git clone https://github.com/GoogleCloudPlatform/python-docs-samples.git
    $ cd python-docs-samples/composer/rest
  2. Install pip and virtualenv if you do not already have them. You may want to refer to the Python Development Environment Setup Guide for Google Cloud Platform for instructions.

  3. Create a virtualenv. Samples are compatible with Python 2.7 and 3.4+.

    $ virtualenv env
    $ source env/bin/activate
  4. Install the dependencies needed to run the samples.

    $ pip install -r requirements.txt

Samples

Determine client ID associated with a Cloud Composer environment

To run this sample:

$ python get_client_id.py

usage: get_client_id.py [-h] project_id location composer_environment

Get the client ID associated with a Cloud Composer environment.

positional arguments:
  project_id            Your Project ID.
  location              Region of the Cloud Composer environment.
  composer_environment  Name of the Cloud Composer environment.

optional arguments:
  -h, --help            show this help message and exit

If you run into any issues with get_client_id.py, you can also get the client ID with the following command, where AIRFLOW_URL is the URL of your Airflow web server (e.g. https://*****************-tp.appspot.com):

$ curl -v AIRFLOW_URL 2>&1 >/dev/null | grep -o "client_id\=[A-Za-z0-9-]*\.apps\.googleusercontent\.com"
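
For reference, the sketch below shows one way get_client_id.py can be implemented. The idea is that the environment resource exposes the Airflow web server URL, and an unauthenticated request to that URL is redirected to an Identity-Aware Proxy sign-in page whose URL carries the OAuth client ID as a query parameter. The v1beta1 endpoint and the airflowUri field are assumptions about the Composer REST API as of this writing, not a guaranteed contract:

    import google.auth
    import google.auth.transport.requests
    import requests
    import six.moves.urllib.parse


    def get_client_id(project_id, location, composer_environment):
        # Authenticate with Application Default Credentials.
        credentials, _ = google.auth.default(
            scopes=['https://www.googleapis.com/auth/cloud-platform'])
        authed_session = google.auth.transport.requests.AuthorizedSession(
            credentials)

        # Fetch the environment resource to discover the Airflow web server URL.
        environment_url = (
            'https://composer.googleapis.com/v1beta1/projects/{}/locations/{}'
            '/environments/{}').format(
                project_id, location, composer_environment)
        composer_response = authed_session.request('GET', environment_url)
        environment_data = composer_response.json()
        airflow_uri = environment_data['config']['airflowUri']

        # An unauthenticated request to the web server is redirected to an
        # IAP sign-in page; the client ID is a query parameter of that URL.
        redirect_response = requests.get(airflow_uri, allow_redirects=False)
        redirect_location = redirect_response.headers['location']
        parsed = six.moves.urllib.parse.urlparse(redirect_location)
        query_string = six.moves.urllib.parse.parse_qs(parsed.query)
        print(query_string['client_id'][0])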

Determine Cloud Storage path for DAGs

To run this sample:

$ python get_dag_prefix.py

usage: get_dag_prefix.py [-h] project_id location composer_environment

Get a Cloud Composer environment via the REST API.

This code sample gets a Cloud Composer environment resource and prints the
Cloud Storage path used to store Apache Airflow DAGs.

positional arguments:
  project_id            Your Project ID.
  location              Region of the Cloud Composer environment.
  composer_environment  Name of the Cloud Composer environment.

optional arguments:
  -h, --help            show this help message and exit
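
For reference, here is a minimal sketch of the approach get_dag_prefix.py takes: fetch the environment resource and print the Cloud Storage path of its DAGs folder. As above, the v1beta1 endpoint and the dagGcsPrefix field name are assumptions about the Composer REST API rather than a guaranteed contract:

    import google.auth
    import google.auth.transport.requests


    def get_dag_prefix(project_id, location, composer_environment):
        # Authenticate with Application Default Credentials.
        credentials, _ = google.auth.default(
            scopes=['https://www.googleapis.com/auth/cloud-platform'])
        authed_session = google.auth.transport.requests.AuthorizedSession(
            credentials)

        # GET the environment resource via the Composer REST API.
        environment_url = (
            'https://composer.googleapis.com/v1beta1/projects/{}/locations/{}'
            '/environments/{}').format(
                project_id, location, composer_environment)
        response = authed_session.request('GET', environment_url)
        environment_data = response.json()

        # The DAGs folder lives in a Cloud Storage bucket, exposed on the
        # resource as config.dagGcsPrefix (e.g. gs://<bucket>/dags).
        print(environment_data['config']['dagGcsPrefix'])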