**To start the recognition of unsafe content in a stored video**

The following ``start-content-moderation`` command starts a job to detect unsafe content in the specified video file stored in an Amazon S3 bucket. ::

    aws rekognition start-content-moderation \
        --video "S3Object={Bucket=MyVideoS3Bucket,Name=MyVideoFile.mpg}"

Output::

    {
        "JobId": "1234567890abcdef1234567890abcdef1234567890abcdef1234567890abcdef"
    }

For more information, see `Detecting Unsafe Stored Videos <https://docs.aws.amazon.com/rekognition/latest/dg/procedure-moderate-videos.html>`__ in the *Amazon Rekognition Developer Guide*.
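The job runs asynchronously, so the command returns only a ``JobId``. As one possible follow-up (illustrative, not part of this example), the returned ``JobId`` can be passed to the ``get-content-moderation`` command to retrieve the analysis results once the job completes::

    # Illustrative follow-up: fetch the unsafe-content results for the job,
    # using the JobId returned by start-content-moderation above.
    aws rekognition get-content-moderation \
        --job-id "1234567890abcdef1234567890abcdef1234567890abcdef1234567890abcdef"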