# Generated by default/object.tt
package Paws::Personalize::BatchInferenceJob;
  use Moose;
  has BatchInferenceJobArn => (is => 'ro', isa => 'Str', request_name => 'batchInferenceJobArn', traits => ['NameInRequest']);
  has BatchInferenceJobConfig => (is => 'ro', isa => 'Paws::Personalize::BatchInferenceJobConfig', request_name => 'batchInferenceJobConfig', traits => ['NameInRequest']);
  has CreationDateTime => (is => 'ro', isa => 'Str', request_name => 'creationDateTime', traits => ['NameInRequest']);
  has FailureReason => (is => 'ro', isa => 'Str', request_name => 'failureReason', traits => ['NameInRequest']);
  has FilterArn => (is => 'ro', isa => 'Str', request_name => 'filterArn', traits => ['NameInRequest']);
  has JobInput => (is => 'ro', isa => 'Paws::Personalize::BatchInferenceJobInput', request_name => 'jobInput', traits => ['NameInRequest']);
  has JobName => (is => 'ro', isa => 'Str', request_name => 'jobName', traits => ['NameInRequest']);
  has JobOutput => (is => 'ro', isa => 'Paws::Personalize::BatchInferenceJobOutput', request_name => 'jobOutput', traits => ['NameInRequest']);
  has LastUpdatedDateTime => (is => 'ro', isa => 'Str', request_name => 'lastUpdatedDateTime', traits => ['NameInRequest']);
  has NumResults => (is => 'ro', isa => 'Int', request_name => 'numResults', traits => ['NameInRequest']);
  has RoleArn => (is => 'ro', isa => 'Str', request_name => 'roleArn', traits => ['NameInRequest']);
  has SolutionVersionArn => (is => 'ro', isa => 'Str', request_name => 'solutionVersionArn', traits => ['NameInRequest']);
  has Status => (is => 'ro', isa => 'Str', request_name => 'status', traits => ['NameInRequest']);
1;

### main pod documentation begin ###

=head1 NAME

Paws::Personalize::BatchInferenceJob

=head1 USAGE

This class represents one of two things:

=head3 Arguments in a call to a service

Use the attributes of this class as arguments to methods. You shouldn't
make instances of this class. Each attribute should be used as a named
argument in the calls that expect this type of object.

As an example, if Att1 is expected to be a
Paws::Personalize::BatchInferenceJob object:

  $service_obj->Method(Att1 => { BatchInferenceJobArn => $value, ..., Status => $value });

=head3 Results returned from an API call

Use accessors for each attribute. If Att1 is expected to be a
Paws::Personalize::BatchInferenceJob object:

  $result = $service_obj->Method(...);
  $result->Att1->BatchInferenceJobArn

=head1 DESCRIPTION

Contains information on a batch inference job.

=head1 ATTRIBUTES

=head2 BatchInferenceJobArn => Str

The Amazon Resource Name (ARN) of the batch inference job.

=head2 BatchInferenceJobConfig => L<Paws::Personalize::BatchInferenceJobConfig>

A string-to-string map of the configuration details of a batch
inference job.

=head2 CreationDateTime => Str

The time at which the batch inference job was created.

=head2 FailureReason => Str

If the batch inference job failed, the reason for the failure.

=head2 FilterArn => Str

The ARN of the filter used on the batch inference job.

=head2 JobInput => L<Paws::Personalize::BatchInferenceJobInput>

The Amazon S3 path that leads to the input data used to generate the
batch inference job.

=head2 JobName => Str

The name of the batch inference job.

=head2 JobOutput => L<Paws::Personalize::BatchInferenceJobOutput>

The Amazon S3 bucket that contains the output data generated by the
batch inference job.

=head2 LastUpdatedDateTime => Str

The time at which the batch inference job was last updated.

=head2 NumResults => Int

The number of recommendations generated by the batch inference job.
This number includes the error messages generated for failed input
records.

=head2 RoleArn => Str

The ARN of the Amazon Identity and Access Management (IAM) role that
requested the batch inference job.
=head2 SolutionVersionArn => Str

The Amazon Resource Name (ARN) of the solution version from which the
batch inference job was created.

=head2 Status => Str

The status of the batch inference job. The status is one of the
following values:

=over

=item *

PENDING

=item *

IN PROGRESS

=item *

ACTIVE

=item *

CREATE FAILED

=back

=head1 SEE ALSO

This class forms part of L<Paws>, describing an object used in
L<Paws::Personalize>

=head1 BUGS and CONTRIBUTIONS

The source code is located here:
L<https://github.com/pplu/aws-sdk-perl>

Please report bugs to:
L<https://github.com/pplu/aws-sdk-perl/issues>

=cut
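=pod

=head1 EXAMPLE

A minimal sketch of reading this object from a service response. It
assumes a configured L<Paws> client with valid AWS credentials, that
the C<Personalize> service exposes a C<DescribeBatchInferenceJob>
method whose result carries a C<BatchInferenceJob> accessor returning
an instance of this class, and that the job ARN shown is a placeholder
for a real one in your account.

  use Paws;

  # Hypothetical region and job ARN; substitute your own values.
  my $personalize = Paws->service('Personalize', region => 'us-east-1');

  my $result = $personalize->DescribeBatchInferenceJob(
    BatchInferenceJobArn =>
      'arn:aws:personalize:us-east-1:123456789012:batch-inference-job/example',
  );

  # Each attribute of Paws::Personalize::BatchInferenceJob is a
  # read-only Moose accessor on the returned object.
  my $job = $result->BatchInferenceJob;
  printf "Job %s is %s\n", $job->JobName, $job->Status;

Because the attributes are read-only, inspect results through these
accessors; to start a job, pass the equivalent named arguments to the
service's create call instead of constructing this class directly.

=cut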