7 Nov 2022
New feature to support Direct-to-S3 migration for input/output files of Design Automation
If you are working with Design Automation and your input/output files come from Autodesk OSS (Object Storage Service) or Autodesk Docs, please read this blog before migrating your workflow to the Direct-to-S3 approach we announced here.
Design Automation has improved its support for input/output URLs in the format "urn:adsk.objects:os.object:<bucket_name>/<object_key>". Developers are no longer required to generate signed S3 URLs; all you need to prepare is the objectId and an access token, and everything else is handled automatically by Design Automation. This improvement is supported by all Design Automation engines, including Revit, 3ds Max, Inventor and AutoCAD.
Here are the details on how to get or prepare the objectId:
For an input file from OSS:
- Find the objectId of the object in the bucket; the GET Objects endpoint can be used to retrieve the object info.
For an output file from OSS:
- You can create a storage object and get its objectId, but creating a storage is not a must; you can build the objectId directly in the format "urn:adsk.objects:os.object:<BucketKey>/<ObjectKey>". Refer to design.automation-nodejs-revit.parameters.excel as an example, and see the sketch covering both OSS cases below.
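For the two OSS cases, here is a minimal TypeScript sketch (Node 18+ with the global fetch API; the bucket and object names are placeholders, not taken from my sample). It lists the bucket to find an existing object's objectId and composes the objectId for an output object directly:

// Minimal sketch for the OSS cases (Node 18+, global fetch). Bucket and object
// names below are placeholders; the endpoint is the standard OSS v2 GET Objects.
const APS_BASE = "https://developer.api.autodesk.com";

// Input file: look up the objectId of an existing object by listing the bucket.
async function findInputObjectId(accessToken: string, bucketKey: string, objectKey: string): Promise<string> {
  const res = await fetch(`${APS_BASE}/oss/v2/buckets/${bucketKey}/objects`, {
    headers: { Authorization: `Bearer ${accessToken}` }
  });
  if (!res.ok) throw new Error(`GET Objects failed: ${res.status}`);
  const { items } = await res.json() as { items: { objectKey: string; objectId: string }[] };
  const match = items.find(item => item.objectKey === objectKey);
  if (!match) throw new Error(`${objectKey} not found in bucket ${bucketKey}`);
  return match.objectId; // "urn:adsk.objects:os.object:<bucketKey>/<objectKey>"
}

// Output file: no storage creation needed, just compose the objectId directly.
function buildOutputObjectId(bucketKey: string, objectKey: string): string {
  return `urn:adsk.objects:os.object:${bucketKey}/${objectKey}`;
}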
For an input file from Autodesk Docs:
- Get the file version with GET Version; you can find the objectId at response.data.relationships.storage.data.id. Check the example here.
For an output file from Autodesk Docs:
- Please refer to the tutorial covering the whole process of uploading a new file or adding a new file version. In step 3, use the POST projects/:project_id/storage endpoint to create a storage location in OSS where files can be uploaded; you can get the objectId from response.data.id. Check here for an example, and see the sketch covering both Autodesk Docs cases below.
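For the two Autodesk Docs cases, a similar sketch could look like the one below (the project id, version id, folder id and file name are placeholders, and a 3-legged token is assumed; the endpoints are GET Version and POST Storage from the Data Management API):

// Minimal sketch for the Autodesk Docs cases (Node 18+, global fetch).
// projectId, versionId, folderId and fileName are placeholders; a 3-legged token is assumed.
const DM_BASE = "https://developer.api.autodesk.com/data/v1";

// Input file: read the storage objectId of an existing file version (GET Version).
async function getDocsInputObjectId(token: string, projectId: string, versionId: string): Promise<string> {
  const res = await fetch(`${DM_BASE}/projects/${projectId}/versions/${encodeURIComponent(versionId)}`, {
    headers: { Authorization: `Bearer ${token}` }
  });
  if (!res.ok) throw new Error(`GET Version failed: ${res.status}`);
  const version = await res.json();
  return version.data.relationships.storage.data.id; // objectId of the version's storage
}

// Output file: create a new storage location (step 3 of the upload tutorial).
async function createDocsOutputStorage(token: string, projectId: string, folderId: string, fileName: string): Promise<string> {
  const res = await fetch(`${DM_BASE}/projects/${projectId}/storage`, {
    method: "POST",
    headers: {
      Authorization: `Bearer ${token}`,
      "Content-Type": "application/vnd.api+json"
    },
    body: JSON.stringify({
      jsonapi: { version: "1.0" },
      data: {
        type: "objects",
        attributes: { name: fileName },
        relationships: { target: { data: { type: "folders", id: folderId } } }
      }
    })
  });
  if (!res.ok) throw new Error(`POST Storage failed: ${res.status}`);
  const storage = await res.json();
  return storage.data.id; // objectId of the newly created storage
}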
Once you have the objectId, pass it as the workitem argument's url together with the access token in the headers. The access token needs the correct permissions for the storage: for example, the token for an input file objectId should include the data:read scope, and the token for an output file objectId should include the data:write scope. Here is an example of a workitem payload:
{
  "activityId": "mynickname.myactivity+prod",
  "arguments": {
    "rvtFile": {
      "url": "urn:adsk.objects:os.object:revit_da_integration_tests/CountIt.rvt",
      "verb": "get",
      "headers": {
        "Authorization": "Bearer <access_token>"
      }
    },
    "result": {
      "verb": "put",
      "url": "urn:adsk.objects:os.object:revit_da_integration_tests/result.txt",
      "headers": {
        "Authorization": "Bearer <access_token>"
      }
    }
  }
}
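To put the payload to work, here is a sketch of submitting it to the Design Automation v3 workitems endpoint. The activity id, bucket and object keys come straight from the example above; the two tokens are placeholders, where daToken carries the Design Automation code:all scope and storageToken carries the data:read and data:write scopes for the storage.

// Sketch: submit the workitem payload above to Design Automation (Node 18+, global fetch).
// daToken and storageToken are placeholders: daToken needs the code:all scope,
// storageToken needs data:read (input) and data:write (output) for the storage.
const DA_BASE = "https://developer.api.autodesk.com/da/us-east/v3";

async function submitWorkitem(daToken: string, storageToken: string): Promise<void> {
  const payload = {
    activityId: "mynickname.myactivity+prod",
    arguments: {
      rvtFile: {
        url: "urn:adsk.objects:os.object:revit_da_integration_tests/CountIt.rvt",
        verb: "get",
        headers: { Authorization: `Bearer ${storageToken}` }
      },
      result: {
        verb: "put",
        url: "urn:adsk.objects:os.object:revit_da_integration_tests/result.txt",
        headers: { Authorization: `Bearer ${storageToken}` }
      }
    }
  };
  const res = await fetch(`${DA_BASE}/workitems`, {
    method: "POST",
    headers: { Authorization: `Bearer ${daToken}`, "Content-Type": "application/json" },
    body: JSON.stringify(payload)
  });
  if (!res.ok) throw new Error(`POST workitems failed: ${res.status}`);
  const { id, status } = await res.json();
  console.log(`Workitem ${id} submitted, status: ${status}`);
}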
With this new feature, Design Automation automatically handles a few tasks on the backend. Here are some hints:
- Design Automation will create the signed S3 URL from the objectId and access token, then perform the download/upload internally.
- As long as the provided access token is valid, Design Automation will refresh the signed URL when it expires, so you no longer need to worry about signed URLs expiring after one hour.
- Multipart upload is supported by default when uploading large files to OSS.
- Both OSS buckets and ACC/BIM360 buckets work; you need to provide the correct token accordingly, such as a 3-legged token for buckets owned by ACC/BIM360 (see the token sketch below).
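For app-owned OSS buckets, a 2-legged token with the right data scopes is enough. The sketch below uses the Authentication v2 client credentials flow with placeholder client credentials; for buckets owned by ACC/BIM360 you would obtain a 3-legged token through the authorization code flow instead.

// Sketch: get a 2-legged token with data:read/data:write scopes for app-owned OSS buckets.
// clientId/clientSecret are placeholders; ACC/BIM360 buckets need a 3-legged token instead.
async function getTwoLeggedToken(clientId: string, clientSecret: string): Promise<string> {
  const res = await fetch("https://developer.api.autodesk.com/authentication/v2/token", {
    method: "POST",
    headers: {
      Authorization: `Basic ${Buffer.from(`${clientId}:${clientSecret}`).toString("base64")}`,
      "Content-Type": "application/x-www-form-urlencoded"
    },
    body: new URLSearchParams({ grant_type: "client_credentials", scope: "data:read data:write" })
  });
  if (!res.ok) throw new Error(`Token request failed: ${res.status}`);
  const { access_token } = await res.json();
  return access_token;
}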
If you are interested, please check my sample design.automation-nodejs-revit.parameters.excel for the full implementation, and see Prepare Cloud Storage (using Revit as an example) for the storage-preparation workflow.
Enjoy the new feature, and please send an email to aps.help@autodesk.com if you have any questions or need help.