class Google::Apis::VisionV1p2beta1::GoogleCloudVisionV1p4beta1SafeSearchAnnotation
Set of features pertaining to the image, computed by computer vision methods over safe-search verticals (for example, adult, spoof, medical, violence).
Attributes
adult
    Represents the adult content likelihood for the image. Adult content may contain elements such as nudity, pornographic images or cartoons, or sexual activities. Corresponds to the JSON property `adult`.
    @return [String]

medical
    Likelihood that this is a medical image. Corresponds to the JSON property `medical`.
    @return [String]

racy
    Likelihood that the request image contains racy content. Racy content may include (but is not limited to) skimpy or sheer clothing, strategically covered nudity, lewd or provocative poses, or close-ups of sensitive body areas. Corresponds to the JSON property `racy`.
    @return [String]

spoof
    Spoof likelihood. The likelihood that a modification was made to the image's canonical version to make it appear funny or offensive. Corresponds to the JSON property `spoof`.
    @return [String]

violence
    Likelihood that this image contains violent content. Corresponds to the JSON property `violence`.
    @return [String]
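Each of these attributes is returned as a string drawn from the Vision API Likelihood enum (UNKNOWN, VERY_UNLIKELY, UNLIKELY, POSSIBLE, LIKELY, VERY_LIKELY). Below is a minimal sketch of reading these fields, assuming `annotation` is an instance of this class obtained from a Vision API response; the helper name and the threshold chosen here are illustrative, not part of the library.

  # Hypothetical helper: flags an image when any of the selected verticals
  # is at least LIKELY. `annotation` is assumed to be a
  # GoogleCloudVisionV1p4beta1SafeSearchAnnotation from an API response.
  RISKY = %w[LIKELY VERY_LIKELY].freeze

  def flag_unsafe?(annotation)
    [annotation.adult, annotation.racy, annotation.violence].any? do |likelihood|
      RISKY.include?(likelihood)
    end
  end

  puts 'Image flagged by SafeSearch' if flag_unsafe?(annotation)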
Public Class Methods
# File lib/google/apis/vision_v1p2beta1/classes.rb, line 8253
def initialize(**args)
  update!(**args)
end
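Because initialize simply forwards its keyword arguments to update!, an instance can be built directly from the JSON property names. A short sketch (the likelihood strings are illustrative values, not defaults):

  annotation = Google::Apis::VisionV1p2beta1::GoogleCloudVisionV1p4beta1SafeSearchAnnotation.new(
    adult: 'VERY_UNLIKELY',
    medical: 'UNLIKELY',
    racy: 'POSSIBLE',
    spoof: 'VERY_UNLIKELY',
    violence: 'UNLIKELY'
  )
  annotation.racy # => "POSSIBLE"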
Public Instance Methods
Update properties of this object
# File lib/google/apis/vision_v1p2beta1/classes.rb, line 8258
def update!(**args)
  @adult = args[:adult] if args.key?(:adult)
  @medical = args[:medical] if args.key?(:medical)
  @racy = args[:racy] if args.key?(:racy)
  @spoof = args[:spoof] if args.key?(:spoof)
  @violence = args[:violence] if args.key?(:violence)
end
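update! assigns only the attributes whose keys are present in args, leaving the rest untouched, so it can be used to change a subset of fields. A short usage sketch, continuing the `annotation` built above:

  annotation.update!(racy: 'LIKELY')
  annotation.racy  # => "LIKELY"
  annotation.adult # unchanged; :adult was not passed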