Disguise camera control

Hello everyone,

I would like to share my “disguise camera control” layout with you.

As virtual events and XR workflows become the new normal, here is a layout that simulates camera tracking using OSC. It could come in handy if you need to preview Notch blocks or any other xR setups without physical LED screens and camera tracking systems.

I made a simple disguise project setup for this - if anybody wants to test it out, feel free to get in touch and I'll be happy to send it over.
I'm also looking for feedback - maybe there is something I can improve.

The layout is designed to run on an iPad using Safari in full-screen mode.

This is mainly intended for disguise users, but for O-S-C users it could be useful to see how it's set up. I can't guarantee that the scripting I've done is elegant, as I'm totally new to this, but it works as intended.


disguise Camera Control.json (181.8 KB)


Thank you for sharing!

This is very cool!

I’m messing around along similar lines, but for controlling nDisplay directly.

But I may be tempted to just grab your template! :sunglasses:

Actually, any chance you could tell us a bit about what the template does? I mean, I can see it, but not totally, because I'm a bit of a noob with O-S-C. Any tricks worth pointing out?

My first steps using Open Stage Control to control my stage in nDisplay:


Ah, that’s very nice! :slight_smile:
I haven't got to nDisplay just yet; it's on my to-do list.

In your video, you are moving the camera, right? Or is it the environment?

If it's the camera, then you can totally adapt this template. I assume you will need to change the addresses of the OSC messages. If you enable debug on the server, you can see what is sent out; alternatively, you can use software like Protokol from Hexler to pinpoint the OSC messages and values.

So with this template you can control the XYZ of cameras, plus pan, tilt, roll and zoom. The cool thing is that you can define how many cameras you want to use and switch between them while all values are saved (currently per session, but I think I'll have to update the template as per the save method from your post).
Is that something you can set up with nDisplay - multiple cameras, with content rendered to the screens from the POV of whichever camera is selected?

Cheers mate!

Yes, in my vid I'm moving my POV, and that translates to a different perspective on the screens.

It's certainly possible to do camera switching in nDisplay, although, as I only have one camera, it's not something I've looked into yet.

I'm learning some stuff dissecting your template (I'm a total noob with anything coding-related).

One quick question: you've got your XY pad values reading out to a text box, and the code that's doing that is complicated and I don't understand it.

But at a very simple level, you could do it with: z @{xy_1}

But how do you choose to only display either the x or y? So, only display one array index?


If you add @{xy_1} in a text widget's value field, it will return both values, x and y;
but if you add #{@{xy_1}[0]} instead, it will display only the x value, and #{@{xy_1}[1]} will display only the y value.
(small example attached)
xy_readout.json (4.2 KB)
In my surface, extra code in the text field defines how many decimals are shown.
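Concretely, the `#{}` block evaluates a JavaScript expression on the widget value, so the readout logic amounts to plain array indexing plus `toFixed` for the decimals. In standalone JS terms (the values below are made up, and `xy_1` stands in for the `@{xy_1}` widget value):

```javascript
// xy_1 stands in for the @{xy_1} widget value: an [x, y] array.
const xy_1 = [0.123456, -1.98765];

const x = xy_1[0];                    // index 0 -> x only
const yRounded = xy_1[1].toFixed(3);  // y limited to 3 decimals (a string)

console.log(x, yRounded);
```

So in the template, something like `#{@{xy_1}[1].toFixed(3)}` gives a y readout trimmed to three decimals.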

I took my surface a small step further - connected a PS3 controller:

But that made a few things break, which I'm trying to troubleshoot now.

Nice, thanks! I've been messing around too, with a somewhat different approach - still at the conceptual stage: