community.aws.data_pipeline module – Create and manage AWS Datapipelines
Note
This module is part of the community.aws collection (version 8.0.0).
You might already have this collection installed if you are using the ansible package. It is not included in ansible-core.
To check whether it is installed, run ansible-galaxy collection list.
To install it, use: ansible-galaxy collection install community.aws.
You need further requirements to be able to use this module, see Requirements for details.
To use it in a playbook, specify: community.aws.data_pipeline.
New in community.aws 1.0.0
Synopsis
Create and manage AWS Datapipelines. Creation is not idempotent in AWS, so the uniqueId is created by hashing the options (minus objects) given to the datapipeline.
The pipeline definition must be in the format given here: https://docs.aws.amazon.com/datapipeline/latest/APIReference/API_PutPipelineDefinition.html#API_PutPipelineDefinition_RequestSyntax.
Operations will wait for a configurable amount of time to ensure the pipeline is in the requested state.
Requirements
The below requirements are needed on the host that executes this module.
python >= 3.6
boto3 >= 1.26.0
botocore >= 1.29.0
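Since boto3 and botocore must be available to the Python interpreter on the host that executes the module, one way to satisfy the requirement is a task like the following. This is only a minimal sketch; the Python environment or virtualenv it installs into is an assumption you will need to match to your own setup.
# A minimal sketch of installing the SDK requirements on the executing host;
# adjust to the Python interpreter/virtualenv that Ansible actually uses there.
- name: Ensure boto3 and botocore meet the minimum versions
  ansible.builtin.pip:
    name:
      - boto3>=1.26.0
      - botocore>=1.29.0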
Parameters
Parameter | Comments |
---|---|
access_key | AWS access key ID. See the AWS documentation for more information about access tokens https://docs.aws.amazon.com/general/latest/gr/aws-sec-cred-types.html#access-keys-and-secret-access-keys. The AWS_ACCESS_KEY_ID, AWS_ACCESS_KEY or EC2_ACCESS_KEY environment variables may also be used in decreasing order of preference. The aws_access_key and profile options are mutually exclusive. The aws_access_key_id alias was added in release 5.1.0 for consistency with the AWS botocore SDK. The ec2_access_key alias has been deprecated and will be removed in a release after 2024-12-01. Support for the EC2_ACCESS_KEY environment variable has been deprecated and will be removed in a release after 2024-12-01. |
aws_ca_bundle | The location of a CA Bundle to use when validating SSL certificates. The AWS_CA_BUNDLE environment variable may also be used. |
aws_config | A dictionary to modify the botocore configuration. Parameters can be found in the AWS documentation https://botocore.amazonaws.com/v1/documentation/api/latest/reference/config.html#botocore.config.Config. |
debug_botocore_endpoint_logs | Use a botocore.endpoint logger to parse the unique (rather than total) "resource:action" API calls made during a task, outputting the set to the resource_actions key in the task results. Use the aws_resource_action callback to output the total list made during a playbook. The ANSIBLE_DEBUG_BOTOCORE_LOGS environment variable may also be used. Choices: false (default), true |
description | An optional description for the pipeline being created. Default: "" |
endpoint_url | URL to connect to instead of the default AWS endpoints. While this can be used to connect to other AWS-compatible services, the amazon.aws and community.aws collections are only tested against AWS. The AWS_URL or EC2_URL environment variables may also be used, in decreasing order of preference. The ec2_url and s3_url aliases have been deprecated and will be removed in a release after 2024-12-01. Support for the EC2_URL and S3_URL environment variables has been deprecated and will be removed in a release after 2024-12-01. |
name | The name of the Datapipeline to create/modify/delete. |
objects | A list of pipeline object definitions, each of which is a dict that takes the keys id, name and fields (see the sketch after this table). Default: [] |
objects → fields | Key-value pairs that define the properties of the object. The value is specified as a reference to another object refValue or as a string value stringValue but not as both. |
objects → fields → key | The field identifier. |
objects → fields → refValue | The field value, expressed as the identifier of another object. Exactly one of stringValue and refValue may be specified. |
objects → fields → stringValue | The field value. Exactly one of stringValue and refValue may be specified. |
objects → id | The ID of the object. |
objects → name | The name of the object. |
parameters | A list of parameter objects (dicts) in the pipeline definition. Default: [] |
parameters → attributes | A list of attributes (dicts) of the parameter object. |
parameters → attributes → key | The field identifier. |
parameters → attributes → stringValue | The field value. |
parameters → id | The ID of the parameter object. |
profile | A named AWS profile to use for authentication. See the AWS documentation for more information about named profiles https://docs.aws.amazon.com/cli/latest/userguide/cli-configure-profiles.html. The AWS_PROFILE environment variable may also be used. The profile option is mutually exclusive with the aws_access_key, aws_secret_key and security_token options. |
region | The AWS region to use. For global services such as IAM, Route53 and CloudFront, region is ignored. The AWS_REGION or EC2_REGION environment variables may also be used. See the Amazon AWS documentation for more information http://docs.aws.amazon.com/general/latest/gr/rande.html#ec2_region. The ec2_region alias has been deprecated and will be removed in a release after 2024-12-01. Support for the EC2_REGION environment variable has been deprecated and will be removed in a release after 2024-12-01. |
secret_key | AWS secret access key. See the AWS documentation for more information about access tokens https://docs.aws.amazon.com/general/latest/gr/aws-sec-cred-types.html#access-keys-and-secret-access-keys. The AWS_SECRET_ACCESS_KEY, AWS_SECRET_KEY or EC2_SECRET_KEY environment variables may also be used in decreasing order of preference. The secret_key and profile options are mutually exclusive. The aws_secret_access_key alias was added in release 5.1.0 for consistency with the AWS botocore SDK. The ec2_secret_key alias has been deprecated and will be removed in a release after 2024-12-01. Support for the EC2_SECRET_KEY environment variable has been deprecated and will be removed in a release after 2024-12-01. |
session_token | AWS STS session token for use with temporary credentials. See the AWS documentation for more information about access tokens https://docs.aws.amazon.com/general/latest/gr/aws-sec-cred-types.html#access-keys-and-secret-access-keys. The AWS_SESSION_TOKEN, AWS_SECURITY_TOKEN or EC2_SECURITY_TOKEN environment variables may also be used in decreasing order of preference. The security_token and profile options are mutually exclusive. Aliases aws_session_token and session_token were added in release 3.2.0, with the parameter being renamed from security_token to session_token in release 6.0.0. The security_token, aws_security_token, and access_token aliases have been deprecated and will be removed in a release after 2024-12-01. Support for the EC2_SECURITY_TOKEN and AWS_SECURITY_TOKEN environment variables has been deprecated and will be removed in a release after 2024-12-01. |
state | The requested state of the pipeline. Choices: "present" (default), "absent", "active", "inactive" |
tags | A dict of key:value pair(s) to add to the pipeline. Default: {} |
timeout | Time in seconds to wait for the pipeline to transition to the requested state, fail otherwise. Default: 300 |
validate_certs | When set to false, SSL certificates will not be validated for communication with the AWS APIs. Setting validate_certs=false is strongly discouraged, as an alternative, consider setting aws_ca_bundle instead. Choices: false, true (default) |
values | A list of parameter values (dicts) in the pipeline definition (see the sketch after this table). Default: [] |
values → id | The ID of the parameter value. |
values → stringValue | The field value. |
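The objects, parameters and values options mirror the PutPipelineDefinition request syntax linked in the Synopsis. The following is only a minimal sketch of how the three sections fit together; the pipeline name, IDs, parameter attribute and S3 path are illustrative placeholders, not required values.
# A minimal sketch of the objects/parameters/values structure described above;
# all IDs, names and the S3 path are illustrative placeholders.
- community.aws.data_pipeline:
    name: sketch-dp
    region: us-west-2
    state: present
    objects:
      - id: "Default"
        name: "Default"
        fields:
          - "key": "scheduleType"
            "stringValue": "ondemand"
    parameters:
      - id: "myS3OutputLoc"
        attributes:
          - "key": "type"
            "stringValue": "AWS::S3::ObjectKey"
    values:
      - id: "myS3OutputLoc"
        stringValue: "s3://example-bucket/output"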
Notes
Note
Caution: For modules, environment variables and configuration files are read from the Ansible ‘host’ context and not the ‘controller’ context. As such, files may need to be explicitly copied to the ‘host’. For lookup and connection plugins, environment variables and configuration files are read from the Ansible ‘controller’ context and not the ‘host’ context.
The AWS SDK (boto3) that Ansible uses may also read defaults for credentials and other settings, such as the region, from its configuration files in the Ansible ‘host’ context (typically ~/.aws/credentials). See https://boto3.amazonaws.com/v1/documentation/api/latest/guide/credentials.html for more information.
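If you prefer not to depend on configuration files in the host context, the authentication parameters listed above can be set directly on the task. A minimal sketch, assuming the credentials are held in Ansible variables (the variable names are placeholders):
# A minimal sketch of supplying credentials explicitly instead of relying on
# the host's ~/.aws configuration; the variable names are placeholders.
- community.aws.data_pipeline:
    name: test-dp
    region: us-west-2
    state: present
    access_key: "{{ my_aws_access_key }}"
    secret_key: "{{ my_aws_secret_key }}"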
Examples
# Note: These examples do not set authentication details, see the AWS Guide for details.

# Create pipeline
- community.aws.data_pipeline:
    name: test-dp
    region: us-west-2
    objects: "{{pipelineObjects}}"
    parameters: "{{pipelineParameters}}"
    values: "{{pipelineValues}}"
    tags:
      key1: val1
      key2: val2
    state: present

# Example populating and activating a pipeline that demonstrates two ways of providing pipeline objects
- community.aws.data_pipeline:
    name: test-dp
    objects:
      - id: "DefaultSchedule"
        name: "Every 1 day"
        fields:
          - "key": "period"
            "stringValue": "1 days"
          - "key": "type"
            "stringValue": "Schedule"
          - "key": "startAt"
            "stringValue": "FIRST_ACTIVATION_DATE_TIME"
      - id: "Default"
        name: "Default"
        fields:
          - "key": "resourceRole"
            "stringValue": "my_resource_role"
          - "key": "role"
            "stringValue": "DataPipelineDefaultRole"
          - "key": "pipelineLogUri"
            "stringValue": "s3://my_s3_log.txt"
          - "key": "scheduleType"
            "stringValue": "cron"
          - "key": "schedule"
            "refValue": "DefaultSchedule"
          - "key": "failureAndRerunMode"
            "stringValue": "CASCADE"
    state: active

# Activate pipeline
- community.aws.data_pipeline:
    name: test-dp
    region: us-west-2
    state: active

# Delete pipeline
- community.aws.data_pipeline:
    name: test-dp
    region: us-west-2
    state: absent
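Because operations wait up to timeout seconds for the pipeline to reach the requested state, a longer timeout can be combined with register when activation is slow. A minimal sketch; the 600-second value and the registered variable name are arbitrary choices:
# Activate with a longer wait and capture the result for later use; the timeout
# value and the variable name are arbitrary.
- community.aws.data_pipeline:
    name: test-dp
    region: us-west-2
    state: active
    timeout: 600
  register: dp_result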
Return Values
Common return values are documented here, the following are the fields unique to this module:
Key | Description |
---|---|
changed | Whether the data pipeline has been modified. Returned: always. Sample: {"changed": true} |
result | Contains the data pipeline data (data_pipeline) and a return message (msg). If the data pipeline exists data_pipeline will contain the keys description, name, pipeline_id, state, tags, and unique_id. If the data pipeline does not exist then data_pipeline will be an empty dict. The msg describes the status of the operation. Returned: always. |
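Registered output can then be queried using the structure described above. A minimal sketch, assuming the task result was captured as dp_result (as in the sketch at the end of the Examples section):
# A minimal sketch of reading the documented return values; assumes the task
# output was registered as dp_result.
- name: Show the pipeline ID and whether anything changed
  ansible.builtin.debug:
    msg: "Pipeline {{ dp_result.result.data_pipeline.pipeline_id }} (changed={{ dp_result.changed }})"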