Clean your project before publishing!

Just a quick note to say that I managed to get my 6.6 MB WebAR project down to 1.15 MB - woohoo!

By far, the biggest savings came from the new command line utility for cleaning projects, which dropped it to 1.7 MB. I had already been following the optimization guidelines, so images, textures, and geometry were in good shape. The rest of the savings came from ditching a font and reducing the size of the tracked image file. (It seems there’s no real benefit to training an image that’s more than about 2K pixels in its largest dimension, regardless of how large the tracked image is in real life.)
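
If you want to pre-shrink your source image before training, something like this works - just a quick sketch of my own in Python with Pillow (not anything from ZapWorks), using 2048 px as the cap since that's roughly the "2K" point where I stopped seeing any benefit:

```python
# Quick sketch: downscale a tracking image so its longest side is at most
# 2048 px before training. Requires Pillow (pip install Pillow); the 2048
# cap is my own rough figure, not an official ZapWorks number.
from PIL import Image

MAX_SIDE = 2048

def shrink_for_training(src_path: str, dst_path: str) -> None:
    with Image.open(src_path) as img:
        longest = max(img.size)
        if longest > MAX_SIDE:
            scale = MAX_SIDE / longest
            img = img.resize(
                (round(img.width * scale), round(img.height * scale)),
                Image.LANCZOS,
            )
        # Convert to RGB so the result can always be saved as a JPEG.
        img.convert("RGB").save(dst_path, quality=90)

# Example paths - substitute your own files.
shrink_for_training("poster-original.png", "poster-for-training.jpg")
```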

FYI, for whatever reason, Studio seems to grossly underestimate the size of published projects, so the project should be exported to determine its actual size.

Ideally, the “cleaning” process should be handled automatically “behind the scenes” when a project is published, but at least now I can provide a better experience for the end user (which is especially important with mobile WebAR).

This is my second WebAR project, and I do have feedback related to WebAR, specifically, but I’ll save that for another post. 🙂

The published size of the project shouldn’t be affected by the cleaning tool - that size should be shown after publish or preview (it’s labelled approx as it doesn’t include the JavaScript, but should include all the assets that are bundled). If you’ve definitely seen that change after running the cleaning tool then that would be interesting to hear.

Image training embeds the original image in the zpt for the Studio preview, but that gets stripped out on publish so shouldn’t affect published size. You’re right though that there’s no practical benefit to using large source images for training, the training process resizes them down to around 500 pixels max at the moment.

There are internal implementation reasons why it was easier to build the cleaning tool into the ZapWorks CLI rather than Studio, but we agree that in an ideal world this would be part of the Studio export process.

If the ZPP file that can be downloaded from my account’s publish history is the same file that gets downloaded by / delivered to the user during the initial “unlocking” phase, then yes, the size of that file can definitely be affected by the cleaning tool - and hugely so.

In fact, that’s what @eric.craft stated in the initial thread which sparked this topic.

Not sure if I understand what you’re saying here, but a larger training image definitely results in a larger ZPP file.

Users download a zip file with only the runtime assets required. It’s different from the zpp. The size that users download is shown in the “project published” window in Studio, next to the zapcode.

The larger training image increases the size of the zpp (as it includes a larger preview image, used when you visualise the TargetFinder in Studio). This won’t be included in the asset zip that users will download (we strip the preview image from the zpt file when publishing).

@simon, what about the published zpp that is available from the project page? Is that representative of the zip file that users download? These files can be massive and waste developers’ bandwidth if that isn’t what is being delivered to the end user.

EDIT: How do we see what is being delivered to the end user without having to dig through app folders on a mobile device?

-Eric

Now THAT’S really good to know. So basically, the end-user experience was NOT being compromised, and the assertion made in this post is incorrect?

Thanks for clearing that up!

Yup, the zpp contains more than is in the user download - it’s the full project file so you can open and edit it in Studio. We thought it would be useful for people not just to be able to revert to a previous publish but to download the full editable project file reflecting each publish (we certainly find that useful internally).

The size shown in the “project published” window should be the size of the asset zip. The easiest way to verify what’s actually downloaded is to use the network tab of a web inspector on https://web.zappar.com when you scan a code - you should see the asset zip and minified JavaScript being transferred (JS is only minified on publish, not preview, so preview is easier to debug).
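
If you’d rather not eyeball the individual requests, you can save the network activity as a HAR file and total it with a few lines of script - a rough sketch (the field names come from the standard HAR format rather than anything Zappar-specific):

```python
# Rough sketch: total the response bytes recorded in a HAR export from the
# browser's network tab, so you can see how big the asset zip and JS were.
import json
import sys

def report(har_path: str) -> None:
    with open(har_path, encoding="utf-8") as f:
        entries = json.load(f)["log"]["entries"]
    total = 0
    for entry in entries:
        size = entry["response"].get("bodySize", 0)
        if size > 0:  # -1 means the size wasn't recorded
            total += size
            print(f"{size / 1024:9.1f} KB  {entry['request']['url']}")
    print(f"{total / 1024:9.1f} KB  total")

report(sys.argv[1])  # e.g. python har_sizes.py web.zappar.com.har
```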

As I mentioned above, we completely accept Studio could work a bit harder to clean up when exporting zpps, though there are some subtleties there (for example we support undo/redo of Media Library actions so at least whilst the project is open Studio needs to keep some older versions around).

I can understand keeping it around locally, but when I am uploading 881 MB for a project that was manually reduced down to around 60 MB, there is a serious issue. There is no reason to upload the full active project. If I am downloading a .zpp, I only want what was part of that publish. Please ignore all history on upload, or give the user the option to include it.
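
For what it’s worth, if the exported .zpp is just a regular zip archive (I’m assuming it is, given that the cleaning tool can strip files out of it), a few lines of Python will show where the weight actually is:

```python
# Rough sketch: list the largest entries inside an exported .zpp, assuming
# it is a standard zip archive, to see what the cleaning tool could remove.
import sys
import zipfile

def largest_entries(zpp_path: str, top: int = 20) -> None:
    with zipfile.ZipFile(zpp_path) as archive:
        infos = sorted(archive.infolist(), key=lambda i: i.file_size, reverse=True)
        for info in infos[:top]:
            print(f"{info.file_size / (1024 * 1024):8.2f} MB  {info.filename}")

largest_entries(sys.argv[1])  # e.g. python zpp_sizes.py my-project.zpp
```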

Thanks for your suggestions. We’ve got a ticket to make these improvements in our internal tracker. In the interim, there is now the command line utility that can remove unneeded files from exported zpp files, which you can then re-import to achieve those savings on the upload for the publish process.

I can’t test this as trying with Chrome 84.0.4147.135 on Windows locks up the Zappar App on my phone.

My suggestion was to visit https://web.zappar.com on your desktop/laptop - as long as you’ve got a webcam you can scan codes there and access developer tools easily. Device debugging in Chrome (i.e. not the Zappar app) would be another route.

It’s likely developer tools has paused on an exception if those settings are enabled. You should be able to hit play in the debugger to continue.