As virtual events and XR workflows are becoming the new normal, here is a layout that simulates camera tracking using OSC. It could come in handy if you need to preview Notch blocks, or any other xR setup for that matter, without physical LED screens and camera tracking systems.
I made a simple disguise project setup for this. If anybody wants to test it out, feel free to get in touch and I'll be happy to send it over.
I'm also looking for feedback; maybe there is something I can improve.
The layout is designed to run on an iPad using Safari in full-screen mode.
This is mainly intended for disguise users, but other O-S-C users might find it useful to see how it's set up. I can't guarantee that the scripting I've done is elegant, as I'm totally new to this, but it works as intended.
I’m messing around along similar lines, but for controlling nDisplay directly.
But I may be tempted to just grab your template!
Actually, any chance you could tell us a bit about what the template does? I mean, I can see it, but not fully, because I'm a bit of a noob with Open Stage Control. Any tricks worth pointing out?
Ah, that’s very nice!
I haven't got to nDisplay just yet; it's on my to-do list.
In your video, you are moving the camera, right? Or is it the environment?
If it's the camera, then you can totally adjust this template. I assume you'll need to change the addresses of the OSC messages. If you enable debug on the server, you can see what is sent out; alternatively, you can use software like Protokol from Hexler to pinpoint OSC messages and values.
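If you'd rather inspect the raw packets yourself than rely on a monitoring tool, the OSC wire format is simple enough to decode by hand. Here's a rough pure-Python sketch (not part of the template, just an illustration of what tools like Protokol are showing you) that unpacks a single OSC message into its address and arguments:

```python
import struct

def decode_osc(packet: bytes):
    """Decode a single OSC message: returns (address, [args])."""
    def read_string(data, i):
        # OSC strings are null-terminated and padded to a 4-byte boundary
        end = data.index(b"\x00", i)
        s = data[i:end].decode()
        i = end + 1
        i += (-i) % 4  # skip padding
        return s, i

    address, i = read_string(packet, 0)
    typetags, i = read_string(packet, i)
    args = []
    for tag in typetags[1:]:          # skip the leading ","
        if tag == "f":                # 32-bit big-endian float
            args.append(struct.unpack(">f", packet[i:i + 4])[0]); i += 4
        elif tag == "i":              # 32-bit big-endian int
            args.append(struct.unpack(">i", packet[i:i + 4])[0]); i += 4
        elif tag == "s":              # string argument
            s, i = read_string(packet, i)
            args.append(s)
    return address, args
```

Point a UDP socket at the port your server sends to, feed each datagram through this, and you'll see the same address/value pairs the debug log prints.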
So with this template you can control the xyz position of the cameras, as well as pan, tilt, roll and zoom. The cool thing is that you can define how many cameras you want to use and switch between them while all values are saved (currently per session, but I think I'll have to update the template to use the save method from your post).
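Conceptually, that per-camera state is just a table of axis values keyed by camera number, with one OSC message per change. A minimal sketch in Python (the `/camera/<n>/<axis>` address pattern here is my assumption, not the template's actual addresses, which depend on your disguise patch):

```python
# Axes the template exposes per camera
AXES = ("x", "y", "z", "pan", "tilt", "roll", "zoom")

def make_cameras(n: int) -> dict:
    """One dict of axis values per camera, all starting at 0.0."""
    return {cam: {axis: 0.0 for axis in AXES} for cam in range(1, n + 1)}

def set_axis(state: dict, cam: int, axis: str, value: float):
    """Store the value so it survives camera switching, and return the
    OSC address/value pair to send (address pattern is hypothetical)."""
    state[cam][axis] = value
    return f"/camera/{cam}/{axis}", value
```

Switching cameras is then just reading a different key out of `state`; nothing is lost because every camera keeps its own dict of values.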
Is that something you can set up with nDisplay: multiple cameras, where content is rendered to the screens from the POV of whichever camera is selected?
If you add @{xy_1} in a text widget's value field, it will return both values, x and y;
but if you add #{@{xy_1}[0]} instead, the text widget will display only the x value, and #{@{xy_1}[1]} will display the y value.
(small example attached) xy_readout.json (4.2 KB)
In my surface, extra code in the text field defines how many decimals are shown.
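For anyone wanting to do the same: since #{} evaluates a JavaScript expression in Open Stage Control, something along these lines in the text widget's value field rounds the readout to two decimals (assuming the same xy_1 widget as above; the exact expression in my surface may differ):

```
#{@{xy_1}[0].toFixed(2)}
```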
I took my surface a small step further and connected a PS3 controller:
But that caused a few things to break, which I'm trying to troubleshoot now.