
18 Sep 2019

Download large derivatives in chunks


If you have ever run into 500 or 504 errors when downloading large derivatives (usually gigabytes in size) using the GET :urn/manifest/:derivativeUrn endpoint, that was most likely due to a limitation on processing long-running GET requests in our backend.

As a workaround, we can download the derivative in chunks by specifying a “Range” option in the request header:

[Screenshot: the Range header option]

Implementing such a mechanism may sound like a tedious task, but we can take this as an opportunity to download large derivatives in chunks with multiple requests in parallel. And by doing so, there’s every chance that we’d also get a significant speed boost.

To do this, we can simply fire multiple requests in parallel, each with its own Range header, to fetch the data chunks. Taking Node.js and Axios as an example, we can put our code together like below:

const axios = require('axios');
const fs = require('fs');

function fetchChunkAsync(tmpPath, rangeString, tokenString) {
  return new Promise((resolve, reject) => {
    axios({
      method: 'get',
      url: 'https://developer.api.autodesk.com/modelderivative/v2/designdata/:urn/manifest/:derivativeUrn',
      headers: {
        'Authorization': 'Bearer ' + tokenString,
        'Range': 'bytes=' + rangeString
      },
      responseType: 'stream'
    })
    .then(response => {
      // Resolve only once the chunk has been fully flushed to disk
      const writer = fs.createWriteStream(tmpPath);
      response.data.pipe(writer);
      writer.on('finish', () => resolve());
      writer.on('error', error => reject(error));
    })
    .catch(error => reject(error));
  });
}

You can calculate the ideal chunk sizes and set up config objects with paths for temporary storage, put them into an array in the right sequence, and pass them to the above function, which fetches the data and returns a “Promise” for callback handling when done.
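For instance, a minimal sketch of building those config objects (the buildChunkConfigs helper and the temp-file naming are illustrative assumptions, not part of the Model Derivative API; the total size could come from a prior HEAD request):

```javascript
// Hypothetical helper: split a derivative of `totalSize` bytes into
// `chunkCount` roughly equal ranges, each with its own temp file path.
function buildChunkConfigs(totalSize, chunkCount, tmpDir) {
  const chunkSize = Math.ceil(totalSize / chunkCount);
  const configs = [];
  for (let i = 0; i < chunkCount; i++) {
    const start = i * chunkSize;
    // Range end is inclusive, and the last chunk must not overshoot.
    const end = Math.min(start + chunkSize - 1, totalSize - 1);
    configs.push({
      rangeString: start + '-' + end,          // e.g. "0-10485759"
      tmpPath: tmpDir + '/chunk_' + i + '.tmp'
    });
  }
  return configs;
}
```

Keeping the configs in byte order matters: the chunks must be concatenated back in that same sequence later.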

Finally, merge all chunks together into the final file as soon as all the requests complete and their temporary files are written:

Promise.all(chunkConfigs.map(config => fetchChunkAsync(config.tmpPath, config.rangeString, tokenString)))
  .then(() => {
    chunkConfigs.forEach(config => {
      // concat chunks together and clean up
    });
  });

We will consider stating this limitation explicitly in the documentation for the relevant endpoints, and we may have more updates to share later. Thanks, and until next time!
