
Execute Workitem

A job that executes a specified Activity, using specified input files and generating appropriate output files.

The relationship between an Activity and a WorkItem can be thought of as that between a "function definition" and a "function call", respectively. The Activity specifies the AppBundle(s) to use, which in turn specify the Engine to use. The WorkItem is then submitted to execute that Activity against specific inputs and outputs.
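
To make the analogy concrete, here is an abbreviated, illustrative sketch of how the two specifications relate. The names, engine, and URLs below are placeholders, not the ones created in this tutorial, and the real Activity also defines a command line:

// Activity: the "function definition" (illustrative only)
{
    "id": "MyActivity",
    "engine": "Autodesk.AutoCAD+24",
    "appbundles": ["MyNickname.MyAppBundle+prod"],
    "parameters": {
        "inputFile": { "verb": "get", "localName": "input.dwg" },
        "outputFile": { "verb": "put", "localName": "output.dwg" }
    }
}

// WorkItem: the "function call", binding concrete URLs to those parameters
{
    "activityId": "MyNickname.MyActivity+prod",
    "arguments": {
        "inputFile": { "url": "https://.../signed-input-url" },
        "outputFile": { "verb": "put", "url": "https://.../signed-output-url" }
    }
}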

In this tutorial sample, the workitem specifies the input file URL, the input JSON data with the new parameter values, and the destination URL for the output file. The sample uploads the input file to an OSS bucket before starting the workitem.

The following endpoints should be added to the DesignAutomation.js file before its last line, module.exports = router;

  • StartWorkitem

This is where we actually start the Design Automation process. This endpoint also uploads the input file to an OSS bucket and specifies that the output should be saved to the same bucket. To help you identify the files, both input and output use the original file name, prefixed with a timestamp and an input/output marker.

routes/DesignAutomation.js
/// <summary>
/// Direct To S3
/// ref : https://aps.autodesk.com/blog/new-feature-support-direct-s3-migration-inputoutput-files-design-automation
/// </summary>

const getObjectId = async (bucketKey, objectKey, req) => {
    try {
        let contentStream = _fs.createReadStream(req.file.path);

        // uploadResources takes an object or an array of resources to upload with their parameters;
        // here we pass just one object.
        let uploadResponse = await new ForgeAPI.ObjectsApi().uploadResources(
            bucketKey,
            [
                // object
                {
                    objectKey: objectKey,
                    data: contentStream,
                    length: req.file.size,
                },
            ],
            {
                useAcceleration: false, // whether or not to generate an accelerated signed URL
                minutesExpiration: 20, // custom expiration time within the 1 to 60 minutes range; if not specified, the default is 2 minutes
                onUploadProgress: (data) => console.warn(data), // function (progressEvent) => {}
            },
            req.oauth_client,
            req.oauth_token
        );
        // let's check the first (and only) entry.
        if (uploadResponse[0].hasOwnProperty("error") && uploadResponse[0].error) {
            throw new Error(uploadResponse[0].completed.reason);
        }
        console.log(uploadResponse[0].completed.objectId);
        return uploadResponse[0].completed.objectId;
    } catch (ex) {
        console.error("Failed to create ObjectID\n", ex);
        throw ex;
    }
};

/// <summary>
/// Start a new workitem
/// </summary>
router.post('/aps/designautomation/workitems', multer({
    dest: 'uploads/'
}).single('inputFile'), async /*StartWorkitem*/ (req, res) => {
    const input = req.body;

    // basic input validation
    const workItemData = JSON.parse(input.data);
    const widthParam = parseFloat(workItemData.width);
    const heightParam = parseFloat(workItemData.height);
    const activityName = `${Utils.NickName}.${workItemData.activityName}`;
    const browserConnectionId = workItemData.browserConnectionId;

    // save the file on the server
    const ContentRootPath = _path.resolve(_path.join(__dirname, '../..'));
    const fileSavePath = _path.join(ContentRootPath, _path.basename(req.file.originalname));

    // upload file to OSS Bucket
    // 1. ensure bucket exists
    const bucketKey = Utils.NickName.toLowerCase() + '-designautomation';
    try {
        let payload = new ForgeAPI.PostBucketsPayload();
        payload.bucketKey = bucketKey;
        payload.policyKey = 'transient'; // expires in 24h
        await new ForgeAPI.BucketsApi().createBucket(payload, {}, req.oauth_client, req.oauth_token);
    } catch (ex) {
        // in case bucket already exists
    }
    // 2. upload inputFile
    const inputFileNameOSS = `${new Date().toISOString().replace(/[-T:\.Z]/gm, '').substring(0, 14)}_input_${_path.basename(req.file.originalname)}`; // avoid overriding
    // prepare workitem arguments
    const bearerToken = ["Bearer", req.oauth_token.access_token].join(" ");
    // 1. input file
    const inputFileArgument = {
        url: await getObjectId(bucketKey, inputFileNameOSS, req),
        headers: { "Authorization": bearerToken }
    };
    // 2. input json
    const inputJson = {
        width: widthParam,
        height: heightParam
    };
    const inputJsonArgument = {
        // pass the JSON parameters inline as a data: URL (double quotes replaced with single quotes)
        url: "data:application/json, " + JSON.stringify(inputJson).replace(/"/g, "'")
    };
    // 3. output file
    const outputFileNameOSS = `${new Date().toISOString().replace(/[-T:\.Z]/gm, '').substring(0, 14)}_output_${_path.basename(req.file.originalname)}`; // avoid overriding
    const outputFileArgument = {
        url: await getObjectId(bucketKey, outputFileNameOSS, req),
        verb: dav3.Verb.put,
        headers: { "Authorization": bearerToken }
    };

    // prepare & submit workitem
    const workItemSpec = {
        activityId: activityName,
        arguments: {
            inputFile: inputFileArgument,
            inputJson: inputJsonArgument,
            outputFile: outputFileArgument,
        }
    };
    let workItemStatus = null;
    try {
        const api = await Utils.dav3API(req.oauth_token);
        workItemStatus = await api.createWorkItem(workItemSpec);
        monitorWorkItem(req.oauth_client, req.oauth_token, workItemStatus.id, browserConnectionId, outputFileNameOSS, inputFileNameOSS);
    } catch (ex) {
        console.error(ex);
        return (res.status(500).json({
            diagnostic: 'Failed to create a workitem'
        }));
    }
    res.status(200).json({
        workItemId: workItemStatus.id
    });
});
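
For reference, the sample's web page calls this route as a multipart form. A minimal sketch of such a request (the field names inputFile and data match the route above; everything else is illustrative) might look like this:

// Sketch: how a client could call the endpoint above.
// 'fileInput' and 'connectionId' are placeholders for the page's file picker and socket.io connection id.
const formData = new FormData();
formData.append('inputFile', fileInput.files[0]);
formData.append('data', JSON.stringify({
    width: 12,
    height: 8,
    activityName: 'MyActivity+dev',   // activity name (the server prepends the nickname)
    browserConnectionId: connectionId // used by the server to push progress over socket.io
}));

const response = await fetch('/aps/designautomation/workitems', { method: 'POST', body: formData });
const { workItemId } = await response.json();
console.log('Workitem started:', workItemId);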
  • MonitorWorkItem
/// <summary>
/// Monitor work item progress
/// </summary>
async function monitorWorkItem(oauthClient, oauthToken, workItemId, browserConnectionId, outputFileName, inputFileName) {
    const socketIO = require('../server').io;
    try {
        while (true) {
            await new Promise((resolve) => setTimeout(resolve, 2000));
            const api = await Utils.dav3API(oauthToken);
            const status = await api.getWorkitemStatus(workItemId);
            const bucketKey = Utils.NickName.toLowerCase() + '-designautomation';
            const objectsApi = new ForgeAPI.ObjectsApi();
            socketIO.to(browserConnectionId).emit('onComplete', status);
            if (status.status === 'pending' || status.status === 'inprogress') {
                continue;
            }
            let response = await fetch(status.reportUrl);
            socketIO.to(browserConnectionId).emit('onComplete', await response.text());
            if (status.status === 'success') {
                response = await objectsApi.getS3DownloadURL(bucketKey, outputFileName, {
                    useAcceleration: false, minutesExpiration: 15
                }, oauthClient, oauthToken);
                socketIO.to(browserConnectionId).emit('downloadResult', response.body.url);
            } else {
                throw new Error('Work item failed...');
            }
            await objectsApi.deleteObject(bucketKey, inputFileName, oauthClient, oauthToken);
            return;
        }
    } catch (err) {
        console.error(err);
        socketIO.to(browserConnectionId).emit('onError', err);
    }
}
note

In a real-world application you would rely on the callback mechanism of Design Automation instead of polling to find out when the work item has finished.
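
As a rough sketch of that approach (not part of the tutorial code; the callback URL is a placeholder), the workitem spec can include an onComplete callback argument so Design Automation posts the final status back to your server, making the polling loop above unnecessary:

// Sketch only: replace the placeholder URL with a publicly reachable endpoint (e.g. your ngrok address).
const workItemSpec = {
    activityId: activityName,
    arguments: {
        inputFile: inputFileArgument,
        inputJson: inputJsonArgument,
        outputFile: outputFileArgument,
        // Design Automation POSTs the final status (including reportUrl) here when the workitem ends
        onComplete: {
            verb: dav3.Verb.post,
            url: 'https://my-public-host/aps/designautomation/callback'
        }
    }
};

// ...plus a route to receive the callback (assumes JSON body parsing is enabled):
router.post('/aps/designautomation/callback', async (req, res) => {
    res.status(202).end(); // acknowledge quickly, then process req.body (status, reportUrl, etc.)
    console.log('Workitem finished with status:', req.body.status);
});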

  • ClearAccount
/// <summary>
/// Clear the account (for debugging purposes)
/// </summary>
router.delete(
    "/aps/designautomation/account",
    async (/*ClearAccount*/ req, res) => {
        let api = await Utils.dav3API(req.oauth_token);
        // remove all AppBundles and Activities from the account
        await api.deleteForgeApp("me");
        res.status(200).end();
    }
);
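
The sample exposes this route through the Clear Account button mentioned in the Troubleshooting section below; it can also be invoked directly, for example from the browser console of the running sample (a sketch that relies on the sample's existing authentication):

// Sketch: call the route above directly.
await fetch('/aps/designautomation/account', { method: 'DELETE' });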

Everything ready!

Run & Debug

Now that your app is ready, it's time to run it. This is where we can test and check for possible errors (via debugging).

Using the sample

At the top-right, click Configure to define the AppBundle & Activity. This only needs to be done once. Specify the new width and height in the left panel, select the input file, and click Start workitem. The right panel should show the results.

You can find sample files here.

caution

If the plugin code changes, then you need to upload a new AppBundle and increase the version (e.g. v1 to v2). This sample will create a new version every time a new AppBundle is uploaded.

Both input and output files are saved in OSS buckets; you can use the View Models tutorial to view them.

Troubleshooting

1. The results panel doesn't show the entire information

Make sure ngrok is running and the tunnel has not expired. Also make sure the ngrok address is correctly specified in the environment variable.

2. Workitem executes, but the result is not as expected

Consider using the Clear Account button. This removes all AppBundles & Activities from your account. Then define them again.

3. Cannot see my AppBundle in the Configure form

The ZIP bundles are copied to the bundles folder after you build the respective plugin. Make sure the Post-build event is properly defined and runs after the build.

4. Ensuring the correct DLL was uploaded

An easy trick to ensure the correct DLL was uploaded to Design Automation is to check its date. This StackOverflow answer shows how to get the Linker Date (i.e. when the DLL was compiled); you can then log it at the beginning of your code. Note the dates are in the server's timezone.

The plugin is written in C# regardless of the server language.

LogTrace("DLL {0} compiled on {1}",
System.IO.Path.GetFileName(System.Reflection.Assembly.GetExecutingAssembly().Location),
GetLinkerTime(System.Reflection.Assembly.GetExecutingAssembly()));

Ready? Let's run it!

Go to the Debug menu and select Start debugging. The "Debug Console" tab should appear at the bottom.

Open your browser and go to http://localhost:8080.