google.cloud.gcp_mlengine_model module – Creates a GCP Model
Note
This module is part of the google.cloud collection (version 1.3.0).
You might already have this collection installed if you are using the ansible package. It is not included in ansible-core.
To check whether it is installed, run ansible-galaxy collection list.
To install it, use: ansible-galaxy collection install google.cloud. A requirements-file sketch is shown after this note.
You need further requirements to be able to use this module; see Requirements for details.
To use it in a playbook, specify: google.cloud.gcp_mlengine_model.
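If you manage collection installs declaratively, a minimal requirements file can pin the collection instead of installing it ad hoc. The filename requirements.yml and the version constraint below are illustrative, not mandated by the collection:

# requirements.yml -- install with: ansible-galaxy collection install -r requirements.yml
collections:
  - name: google.cloud
    version: ">=1.3.0"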
Synopsis
Represents a machine learning solution.
A model can have multiple versions, each of which is a deployed, trained model ready to receive prediction requests. The model itself is just a container.
Requirements
The below requirements are needed on the host that executes this module.
python >= 2.6
requests >= 2.18.4
google-auth >= 1.3.0
Parameters
| Parameter | Comments |
|---|---|
| access_token | An OAuth2 access token if credential type is accesstoken. |
| auth_kind | The type of credential used. Choices: "application", "machineaccount", "serviceaccount", "accesstoken". |
| default_version | The default version of the model. This version will be used to handle prediction requests that do not specify a version. |
| default_version/name | The name specified for the version when it was created. |
| description | The description specified for the model when it was created. |
| env_type | Specifies which Ansible environment you’re running this module within. This should not be set unless you know what you’re doing. It only alters the User Agent string for any API requests. |
| labels | One or more labels that you can add to organize your models. |
| name | The name specified for the model. |
| online_prediction_console_logging | If true, online prediction nodes send stderr and stdout streams to Stackdriver Logging. Choices: false (default), true. |
| online_prediction_logging | If true, online prediction access logs are sent to Stackdriver Logging. Choices: false (default), true. |
| project | The Google Cloud Platform project to use. |
| regions | The list of regions where the model is going to be deployed. Currently only one region per model is supported. |
| scopes | Array of scopes to be used. |
| service_account_contents | The contents of a Service Account JSON file, either in a dictionary or as a JSON string that represents it. |
| service_account_email | An optional service account email address if machineaccount is selected and the user does not wish to use the default email. |
| service_account_file | The path of a Service Account JSON file if serviceaccount is selected as type. |
| state | Whether the given object should exist in GCP. Choices: "present" (default), "absent". |
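To illustrate how several of these parameters fit together in one task, here is a sketch; the project ID, label keys, and version name are placeholders, and the version referenced by default_version is assumed to already exist:

- name: Create a model with labels and explicit logging (illustrative values)
  google.cloud.gcp_mlengine_model:
    name: census_model
    description: Model with labels and access logging enabled
    regions:
      - us-central1
    labels:
      team: data-science          # arbitrary example label
      env: staging
    online_prediction_logging: true
    default_version:
      name: v1                    # must refer to an existing version
    project: my-gcp-project       # placeholder project ID
    auth_kind: serviceaccount
    service_account_file: "/tmp/auth.pem"
    state: present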
Notes
Note
API Reference: https://cloud.google.com/ai-platform/prediction/docs/reference/rest/v1/projects.models
Official Documentation: https://cloud.google.com/ai-platform/prediction/docs/deploying-models
For authentication, you can set service_account_file using the GCP_SERVICE_ACCOUNT_FILE env variable.
For authentication, you can set service_account_contents using the GCP_SERVICE_ACCOUNT_CONTENTS env variable.
For authentication, you can set service_account_email using the GCP_SERVICE_ACCOUNT_EMAIL env variable.
For authentication, you can set access_token using the GCP_ACCESS_TOKEN env variable.
For authentication, you can set auth_kind using the GCP_AUTH_KIND env variable.
For authentication, you can set scopes using the GCP_SCOPES env variable.
Environment variable values will only be used if the playbook values are not set.
The service_account_email and service_account_file options are mutually exclusive.
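As a sketch of environment-variable based authentication (assuming the module runs on localhost, so it inherits variables exported in the shell that invokes ansible-playbook), the task can omit the auth parameters entirely:

# Exported before running the playbook (values are placeholders):
#   GCP_AUTH_KIND=serviceaccount
#   GCP_SERVICE_ACCOUNT_FILE=/tmp/auth.pem
- name: Create a model using environment-based credentials (sketch)
  google.cloud.gcp_mlengine_model:
    name: census_model
    regions:
      - us-central1
    project: my-gcp-project   # placeholder; project is still set in the task
    state: present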
Examples
- name: create a model
  google.cloud.gcp_mlengine_model:
    name: "{{ resource_name | replace('-', '_') }}"
    description: My model
    regions:
      - us-central1
    project: test_project
    auth_kind: serviceaccount
    service_account_file: "/tmp/auth.pem"
    state: present
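Although not part of the module's shipped examples, a deletion task follows the same shape; this sketch reuses the placeholder values above and simply flips state to absent:

- name: delete a model
  google.cloud.gcp_mlengine_model:
    name: "{{ resource_name | replace('-', '_') }}"
    project: test_project
    auth_kind: serviceaccount
    service_account_file: "/tmp/auth.pem"
    state: absent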
Return Values
Common return values are documented here, the following are the fields unique to this module:
| Key | Description |
|---|---|
| defaultVersion | The default version of the model. This version will be used to handle prediction requests that do not specify a version. Returned: success |
| defaultVersion/name | The name specified for the version when it was created. Returned: success |
| description | The description specified for the model when it was created. Returned: success |
| labels | One or more labels that you can add to organize your models. Returned: success |
| name | The name specified for the model. Returned: success |
| onlinePredictionConsoleLogging | If true, online prediction nodes send stderr and stdout streams to Stackdriver Logging. Returned: success |
| onlinePredictionLogging | If true, online prediction access logs are sent to Stackdriver Logging. Returned: success |
| regions | The list of regions where the model is going to be deployed. Currently only one region per model is supported. Returned: success |
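To show how these return values can be consumed, here is a sketch that registers the module result and reads the returned name field; the variable name model_result is illustrative, and it assumes the fields appear at the top level of the registered result as is typical for these modules:

- name: create a model and capture the result
  google.cloud.gcp_mlengine_model:
    name: census_model
    regions:
      - us-central1
    project: test_project
    auth_kind: serviceaccount
    service_account_file: "/tmp/auth.pem"
    state: present
  register: model_result

- name: show the model name returned by the API
  ansible.builtin.debug:
    msg: "Created model {{ model_result.name }}"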