BatchPredictionJob class final
A job that uses a Model to produce predictions on multiple input instances
(google.cloud.aiplatform.v1beta1.BatchPredictionJob.input_config).
If predictions for a significant portion of the instances fail, the job may
finish without attempting predictions for all remaining instances.
- Inheritance
-
- Object
- ProtoMessage
- BatchPredictionJob
Constructors
- BatchPredictionJob({String name = '', required String displayName, String model = '', String modelVersionId = '', UnmanagedContainerModel? unmanagedContainerModel, required BatchPredictionJob_InputConfig? inputConfig, BatchPredictionJob_InstanceConfig? instanceConfig, Value? modelParameters, required BatchPredictionJob_OutputConfig? outputConfig, BatchDedicatedResources? dedicatedResources, String serviceAccount = '', ManualBatchTuningParameters? manualBatchTuningParameters, bool generateExplanation = false, ExplanationSpec? explanationSpec, BatchPredictionJob_OutputInfo? outputInfo, JobState state = JobState.$default, Status? error, List<Status> partialFailures = const [], ResourcesConsumed? resourcesConsumed, CompletionStats? completionStats, Timestamp? createTime, Timestamp? startTime, Timestamp? endTime, Timestamp? updateTime, Map<String, String> labels = const {}, EncryptionSpec? encryptionSpec, ModelMonitoringConfig? modelMonitoringConfig, List<ModelMonitoringStatsAnomalies> modelMonitoringStatsAnomalies = const [], Status? modelMonitoringStatus, bool disableContainerLogging = false, bool satisfiesPzs = false, bool satisfiesPzi = false})
- BatchPredictionJob.fromJson(Object? j)
-
factory
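A minimal construction sketch in Dart, assuming the nested input and output configs are assembled elsewhere (buildInputConfig and buildOutputConfig are hypothetical helpers, and the resource names are placeholders); only parameters listed in the constructor above are used:

  // Hypothetical helpers; their return values mirror the v1beta1 proto config messages.
  final BatchPredictionJob_InputConfig inputConfig = buildInputConfig();
  final BatchPredictionJob_OutputConfig outputConfig = buildOutputConfig();

  final job = BatchPredictionJob(
    displayName: 'nightly-scoring',    // required
    model: 'projects/my-project/locations/us-central1/models/my-model',
    inputConfig: inputConfig,          // required
    outputConfig: outputConfig,        // required
    labels: {'team': 'ml-platform'},
  );

Per the field descriptions below, exactly one of model and unmanagedContainerModel should be set.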
Properties
- completionStats → CompletionStats?
-
Output only. Statistics on completed and failed prediction instances.
final
- createTime → Timestamp?
-
Output only. Time when the BatchPredictionJob was created.
final
- dedicatedResources → BatchDedicatedResources?
-
The config of resources used by the Model during the batch prediction. If
the Model supports DEDICATED_RESOURCES this config may be provided (and the
job will use these resources); if the Model doesn't support
AUTOMATIC_RESOURCES, this config must be provided.
final
- disableContainerLogging → bool
-
For custom-trained Models and AutoML Tabular Models, the container of the
DeployedModel instances will send stderr and stdout streams to Cloud Logging
by default. Please note that the logs incur costs, which are subject to
Cloud Logging pricing.
final
- displayName → String
-
Required. The user-defined name of this BatchPredictionJob.
final
- encryptionSpec → EncryptionSpec?
-
Customer-managed encryption key options for a BatchPredictionJob. If this
is set, then all resources created by the BatchPredictionJob will be
encrypted with the provided encryption key.
final
- endTime → Timestamp?
-
Output only. Time when the BatchPredictionJob entered any of the following
states: JOB_STATE_SUCCEEDED, JOB_STATE_FAILED, JOB_STATE_CANCELLED.
final
- error → Status?
-
Output only. Only populated when the job's state is JOB_STATE_FAILED or
JOB_STATE_CANCELLED.
final
- explanationSpec → ExplanationSpec?
-
Explanation configuration for this BatchPredictionJob. Can be specified only
if generate_explanation is set to true (see the sketch after the Properties
section).
final
- generateExplanation → bool
-
Generate explanation with the batch prediction results.
final
- hashCode → int
-
The hash code for this object.
no setter inherited
- inputConfig → BatchPredictionJob_InputConfig?
-
Required. Input configuration of the instances on which predictions are
performed. The schema of any single instance may be specified via the Model's
(google.cloud.aiplatform.v1beta1.BatchPredictionJob.model) PredictSchemata
(google.cloud.aiplatform.v1beta1.Model.predict_schemata) instance_schema_uri.
final
- instanceConfig → BatchPredictionJob_InstanceConfig?
-
Configuration for how to convert batch prediction input instances to the
prediction instances that are sent to the Model.
final
- labels → Map<String, String>
-
The labels with user-defined metadata to organize BatchPredictionJobs.
final
- manualBatchTuningParameters → ManualBatchTuningParameters?
-
Immutable. Parameters configuring the batch behavior. Currently only
applicable when dedicated_resources are used (in other cases Vertex AI does
the tuning itself).
final
- model → String
-
The name of the Model resource that produces the predictions via this job;
it must share the same ancestor Location.
Starting this job has no impact on any existing deployments of the Model
and their resources.
Exactly one of model and unmanaged_container_model must be set.
final
- modelMonitoringConfig → ModelMonitoringConfig?
-
Model monitoring config will be used to analyze model behaviors, based on
the input and output of the batch prediction job, as well as the provided
training dataset.
final
- modelMonitoringStatsAnomalies → List<ModelMonitoringStatsAnomalies>
-
Get batch prediction job monitoring statistics.
final
- modelMonitoringStatus → Status?
-
Output only. The running status of the model monitoring pipeline.
final
- modelParameters → Value?
-
The parameters that govern the predictions. The schema of the parameters may
be specified via the Model's
(google.cloud.aiplatform.v1beta1.BatchPredictionJob.model) PredictSchemata
(google.cloud.aiplatform.v1beta1.Model.predict_schemata) parameters_schema_uri.
final
- modelVersionId → String
-
Output only. The version ID of the Model that produces the predictions via
this job.
final
- name → String
-
Output only. Resource name of the BatchPredictionJob.
final
- outputConfig → BatchPredictionJob_OutputConfig?
-
Required. The configuration specifying where output predictions should be
written. The schema of any single prediction may be specified as a
concatenation of the Model's
(google.cloud.aiplatform.v1beta1.BatchPredictionJob.model) PredictSchemata
(google.cloud.aiplatform.v1beta1.Model.predict_schemata) instance_schema_uri
and prediction_schema_uri.
final
- outputInfo → BatchPredictionJob_OutputInfo?
-
Output only. Information further describing the output of this job.
final
- partialFailures → List<Status>
-
Output only. Partial failures encountered.
For example, single files that can't be read.
This field never exceeds 20 entries.
Status details fields contain standard Google Cloud error details.
final
- qualifiedName → String
-
The fully qualified name of this message, e.g.,
google.protobuf.Duration or google.rpc.ErrorInfo.
final inherited
- resourcesConsumed → ResourcesConsumed?
-
Output only. Information about resources that have been consumed by this job.
Provided in real time on a best-effort basis, as well as a final value once
the job completes.
final
- runtimeType → Type
-
A representation of the runtime type of the object.
no setter inherited
- satisfiesPzi → bool
-
Output only. Reserved for future use.
final
- satisfiesPzs → bool
-
Output only. Reserved for future use.
final
- serviceAccount → String
-
The service account that the DeployedModel's container runs as. If not
specified, a system-generated one will be used, which has minimal
permissions; the custom container, if used, may not have enough permission
to access other Google Cloud resources.
final
- startTime → Timestamp?
-
Output only. Time when the BatchPredictionJob first entered the
JOB_STATE_RUNNING state.
final
- state → JobState
-
Output only. The detailed state of the job.
final
- unmanagedContainerModel → UnmanagedContainerModel?
-
Contains model information necessary to perform batch prediction without
requiring the model to be uploaded to the Model Registry.
Exactly one of model and unmanaged_container_model must be set.
final
- updateTime → Timestamp?
-
Output only. Time when the BatchPredictionJob was most recently updated.
final
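A hedged sketch of some optional knobs from the properties above (values are hypothetical; modelResourceName, inputConfig, outputConfig, and explanationSpec are assumed to be built elsewhere, and explanationSpec may be set only when generateExplanation is true):

  final explainedJob = BatchPredictionJob(
    displayName: 'scoring-with-explanations',
    model: modelResourceName,           // must share the job's ancestor Location
    inputConfig: inputConfig,           // as in the earlier sketch
    outputConfig: outputConfig,
    generateExplanation: true,          // explanationSpec is honored only when this is true
    explanationSpec: explanationSpec,   // assumed built elsewhere
    serviceAccount: 'batch-sa@my-project.iam.gserviceaccount.com', // hypothetical account
    disableContainerLogging: true,      // suppress stderr/stdout Cloud Logging (and its cost)
    labels: {'env': 'prod'},
  );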
Methods
- noSuchMethod(Invocation invocation) → dynamic
-
Invoked when a nonexistent method or property is accessed.
inherited
- toJson() → Object
-
See the JSON round-trip sketch after this section.
override
- toString() → String
-
A string representation of this object.
override
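A minimal JSON round-trip sketch, reusing the job value from the construction sketch after the Constructors section (the exact field naming of the emitted JSON is whatever the generated library produces):

  final encoded = job.toJson();                          // JSON-compatible Object
  final decoded = BatchPredictionJob.fromJson(encoded);  // rebuilds the message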
Operators
- operator ==(Object other) → bool
-
The equality operator.
inherited
Constants
- fullyQualifiedName → const String