Lua API for VR

madsbuvi
Posts: 11
Joined: 28 Dec 2019, 15:53
Gitlab profile: https://gitlab.com/madsbuvi

Lua API for VR

Post by madsbuvi »

The main obstacle to merging VR is the sheer complexity of its MR, with a *lot* of changes scattered pseudo-randomly throughout the code and duplicates of many UI files made for small changes. One step toward reducing this complexity is to rewrite as much of the MR as possible as a Lua mod. This will also give modders the tools to improve or build on the VR experience, and may along the way provide modding tools for flatscreen openmw as well.

Engine requirements:
Evidently, not every part of VR can be moved to a Lua script; the engine must still implement a few things. Both Stereo and VR should be enabled by a start-up switch, either by compiling separate binaries as is currently the case, or by some command line parameter.

Requirements:
A summary of what the Lua API needs to achieve.
  • Tracker to actor bone mapping.
    Motivation: Self-evident
  • Tracker to camera mapping.
    Motivation: Self-evident
  • Modifying visibility of actor parts.
    Motivation: Many people will find disembodied limbs to be a more comfortable VR experience.
  • Arbitrary control of actor bones.
    Motivation: Implementing visual gestures such as finger pointing.
  • Read world space or model space position of any part/bone.
    Motivation: Implementing realistic combat and archery.
  • Input subactions.
    Motivation: VR inputs differ from other inputs in that they may need to differentiate which limb triggered the input.
Prerequisites:
  • Lua API for GUI - For modifying existing UI items and providing new UI items, rather than having to modify existing UI layout files.
API Summary:
A summary of intended APIs and their motivation
  • Actor Part Manipulation API
    Needed to map tracking data to actor bones. This API shall allow arbitrary manipulation of the position, orientation, and visibility of any and all actor parts.
  • Input Actions and sub-actions
    Needed to extend the input API with VR specific controls. Sub-actions are needed to differentiate control source, such as left vs right hand controllers, if this is relevant for an action.
  • VR API
    An API providing access to VR specific information and methods.
  • 3D GUI API
    Needed to render and interact with GUI as 3D geometry placed within the scene.
I'll make one follow-up post for each of these points.
madsbuvi
Posts: 11
Joined: 28 Dec 2019, 15:53
Gitlab profile: https://gitlab.com/madsbuvi

Re: Lua API for VR

Post by madsbuvi »

Actor Part manipulation API:
While single player VR only needs to modify the player's parts, TES3MP may need to modify other actors in the scene. Therefore these features should be implemented at the actor level, rather than for the player specifically.

To identify body parts I am basing myself on an extension of ESM::PartReferenceType, which I have named PART. ESM::PartReferenceType is used by the NpcAnimation class to identify parts. However, in my use case I need access to more parts. E.g. I use L Finger1 and R Finger1 and each of their immediate children so I can implement finger pointing. I will also need access to the eyes to implement eye tracking in the future, and modeling the camera as a PART is useful for controlling the VR camera.

The alternative to the PART enum is a string identifier. The advantage of this alternative is that modders are not limited to what we put in the enum. In return, part lookup will be slightly slower (probably not noticeably), but more importantly, documentation will need to be provided for the exact string identifier of each part.

I am OK with either implementation, so I think it's up to what other devs/modders think is a good idea.

To implement PART manipulation, I propose an abstract "Attachment" type, which can be attached to any actor's PART. One public Attachment type will be the StaticAttachment type, which allows direct manipulation by the script. VR will provide a private attachment type which gets its tracking data from OpenXR.

VR equipment may also come with a varying number of trackers, so rather than forcing scripts to deal with a varying number of attachments, I propose a "Marionette" type which collects a varying number of attachments. The Marionette type can then also hide logic such as translating from VR STAGE coordinates to WORLD coordinates, by using its attached actor as its anchor.

extends openmw.types with

Code: Select all

// Enum listing all possible parts
TYPE PART

PART.Head
PART.Hair
PART.Neck
PART.Cuirass
PART.Groin
PART.Skirt
PART.LFinger1Joint1
PART.RFinger1Joint1
PART.LFinger1Joint2
PART.RFinger1Joint2
PART.RHand
PART.LHand
PART.RWrist
PART.LWrist
PART.Shield
PART.RForearm
PART.LForearm
PART.RUpperarm
PART.LUpperarm
PART.RFoot
PART.LFoot
PART.RAnkle
PART.LAnkle
PART.RKnee
PART.LKnee
PART.RLeg
PART.LLeg
PART.RPauldron
PART.LPauldron
PART.Weapon
PART.Tail
PART.LEye // Will allow eye tracking
PART.REye
PART.Camera

TYPE Attachment

Attachment.attachTo(Actor, PART)
Attachment.detach()
Attachment.isStale()
Attachment.disable()
Attachment.enable()

// Used to override the animations/positions of some or all of an actor's PARTs.
TYPE Marionette

Marionette.PARTS // List of parts being controlled by this marionette 
Marionette.attachTo(Actor) // This marionette will take over control of all parts shared by the actor and the marionette.
Marionette.detach() // Releases the actor
Marionette.partPosition(PART) // Get the current position of a part relative to the actor's root.
Marionette.partRotation(PART) // Get the current orientation of a part relative to the actor's root.
Marionette.disablePart(PART) // Disables control of the specified part
Marionette.enablePart(PART) // Re-enables control of the specified part
Marionette.getAttachment(PART)
Marionette.setAttachment(PART, attachment)
Marionette.removeAttachment(PART)

// Can be used to override a PART with position and rotation values supplied by scripts.
// An example use case is to make fingers point straight by setting the rotation override to 0 on all joints.
TYPE StaticAttachment EXTENDS Attachment

StaticAttachment.setRotationOverride(rotation, adjust)
StaticAttachment.setPositionOverride(position, adjust)
StaticAttachment.setRotationOverrideEnabled(bool) // default false
StaticAttachment.setPositionOverrideEnabled(bool) // default false
StaticAttachment.setReferenceFrameAbsolute(bool) // default false
extends openmw.Actor with

Code: Select all

Actor.PARTS // available parts
Actor.hidePart(PART)
Actor.showPart(PART)
Actor.partPosition(PART)
Actor.partRotation(PART)
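
To make the intent concrete, here is a minimal usage sketch for the finger-pointing example, assuming the proposed names above. How a StaticAttachment is constructed, and how a script obtains the target actor, is not settled, so the constructor and the use of openmw.self below are assumptions:

Code: Select all

local types = require('openmw.types')
local self = require('openmw.self') -- in a player script, `self` is the player

-- Assumed factory; the proposal does not specify how StaticAttachments are created.
local pointer = types.StaticAttachment()

-- Force the first joint of the right index finger to a neutral rotation,
-- i.e. the "set the rotation override to 0" example from the comment above.
-- The meaning of the second (adjust) argument is whatever the final API defines.
pointer:setRotationOverride(0, false)
pointer:setRotationOverrideEnabled(true)

-- Take over that bone on the player actor.
pointer:attachTo(self, types.PART.RFinger1Joint1)
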
Discussion point:
Not directly related to Lua, but it has to be solved for this API to work: movement accumulation interacts very poorly with VR, as it bobs the player_root node around. This is uncomfortable and I disable it as much as I can. However, if I outright disable it, animations run away from the camera, which I currently solve by just hiding all parts. I could use some input from experts on the animation system to solve this.
madsbuvi
Posts: 11
Joined: 28 Dec 2019, 15:53
Gitlab profile: https://gitlab.com/madsbuvi

Re: Lua API for VR

Post by madsbuvi »

Input Actions and sub-actions
I should point out that OpenXR's API for specifying bindings is limiting. Its design seems intended to let VR runtimes define their own rebinding APIs, similar to what SteamVR offers for OpenVR games. The most limiting feature is that bindings cannot be modified after initial setup, which is incompatible with allowing a Lua API for modifying bindings. Therefore rebinding is not considered a possible Lua API.

Instead, this API consists only of extending openmw.input with new VR-specific actions and subactions.

Extends openmw.input with VR specific inputs.
extends openmw.input#ACTION with

Code: Select all

ACTION.VrMetaMenu
ACTION.PointerActive
ACTION.Recenter
ACTION.RadialMenu
defines type openmw.input#SUBACTION

Code: Select all

// Defines the VR control source of an action
TYPE SUBACTION

SUBACTION.Head
SUBACTION.LeftHand
SUBACTION.RightHand
SUBACTION.Gamepad
extends openmw.input with

Code: Select all

// Returns true if the given subaction is currently activating the actionId.
// Returns false if not. Note: Always returns false if the action does not have defined subactions.
input.actionSubaction(actionId, SUBACTION)

// Returns a list of subactions that are currently activating the actionId.
// If the action is inactive, or does not have defined subactions, the function returns an empty list.
input.actionSubactions(actionId)
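
As a usage sketch, here is how a script might decide which hand should display the pointer, assuming the proposed functions and the ACTION.PointerActive id listed above:

Code: Select all

local input = require('openmw.input')

-- Query a single control source directly...
local leftPointing = input.actionSubaction(input.ACTION.PointerActive, input.SUBACTION.LeftHand)
local rightPointing = input.actionSubaction(input.ACTION.PointerActive, input.SUBACTION.RightHand)

-- ...or iterate over every sub-action currently activating the action.
for _, sub in ipairs(input.actionSubactions(input.ACTION.PointerActive)) do
    if sub == input.SUBACTION.LeftHand then leftPointing = true end
    if sub == input.SUBACTION.RightHand then rightPointing = true end
end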
madsbuvi
Posts: 11
Joined: 28 Dec 2019, 15:53
Gitlab profile: https://gitlab.com/madsbuvi

Re: Lua API for VR

Post by madsbuvi »

VR API:
defines openmw.vr

Code: Select all

local vr = require('openmw.vr')

Code: Select all

// Returns true if VR is active. All VR specific scripts should check this, so players don't have to swap out scripts between VR and FLAT.
vr.isVr()

// Returns true if VR motion controllers are active
vr.isMotionControllerActive()

// Returns true if the left hand controller is active
vr.isLeftHandActive()

// Returns true if the right hand controller is active
vr.isRightHandActive()

// Returns true if left handed mode is active
vr.isLeftHandedMode()

// Returns a marionette that can control an actor using available tracking data
vr.playerMarionette()

vr.isSeatedMode()
vr.setSeatedMode(bool)

// Request a recenter of the VR view. This eliminates all current difference between the actor and the camera in the XY plane by moving the camera back to the actor.
// If recenterZ is true, and seated mode is enabled, then the eye level is also adjusted to bring the camera to eye level where the player is currently seated.
// recenterZ has no effect in standing play.
vr.recenter(bool recenterZ)
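
A minimal sketch of how a VR-aware player script might use this package, assuming the proposed functions above (attaching the marionette to the player via openmw.self is an assumption based on the Marionette type from the earlier post):

Code: Select all

local vr = require('openmw.vr')
local self = require('openmw.self') -- in a player script, `self` is the player

if not vr.isVr() then return {} end -- the same script can be shipped to flatscreen players

-- Let available tracking data drive the player's parts.
local marionette = vr.playerMarionette()
marionette:attachTo(self)

-- Recenter the view, adjusting eye level only in seated mode.
vr.recenter(vr.isSeatedMode())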
madsbuvi
Posts: 11
Joined: 28 Dec 2019, 15:53
Gitlab profile: https://gitlab.com/madsbuvi

Re: Lua API for VR

Post by madsbuvi »

3D GUI API:
Lua API for GUI is a prerequisite of this. I will work on drafting this module once that is in place. As is, I do a lot of tailoring for each layer in apps/openmw/mwvr/vrgui.cpp, which might be just fine, since moving this to Lua means all this tailoring will be softcoded as Lua scripts.
urm
Posts: 83
Joined: 02 Jun 2017, 16:05
Gitlab profile: https://gitlab.com/uramer

Re: Lua API for VR

Post by urm »

Input Actions and sub-actions

I have an ongoing MR for an extendable input action API; it will be easy to add the VR actions with that: https://gitlab.com/OpenMW/openmw/-/merge_requests/2628

> The most limiting feature is that bindings cannot be modified after initial setup

What if we implement the gameplay bindings one layer deeper? As in we bind "everything" in OpenXR into Lua, and expose that as engine handlers and methods in `openmw.input` (like mouse and keyboard), rather than actions, and implement bindings from that to actions in Lua? If there is a common interface for OpenXR bindings, we could even expose the list of those bindings as a config file, and enable users to accommodate unusual headsets and such in the future.
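
A very rough sketch of what that could look like on the Lua side; the handler name, the raw OpenXR paths, and the binding table below are purely hypothetical:

Code: Select all

local input = require('openmw.input')

-- Hypothetical, user-editable mapping from raw OpenXR input paths to actions.
local bindings = {
    ['/user/hand/right/input/trigger/value'] = input.ACTION.Use,
    ['/user/hand/left/input/menu/click'] = input.ACTION.VrMetaMenu,
}

-- Hypothetical engine handler fired for every raw XR input change.
local function onXrInput(path, value)
    local action = bindings[path]
    if action then
        -- hand the (action, value) pair to whatever Lua-side action dispatch
        -- the engine ends up exposing; deliberately left open here
    end
end

return { engineHandlers = { onXrInput = onXrInput } }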

> defines type openmw.input#SUBACTION

It's not clear what sub-actions are aiming to achieve. Is the idea that each action has a right and left hand version, and potentially a head one too? I think that would be a part of the input binding logic. So the left/right hand buttons are exposed in the `input` package, and the mapping of those to e.g. VrMetaMenu would happen on the Lua side. I.e. the same difference as between knowing that the mouse has left and right buttons, and mapping the left mouse button to attack.

This kind of abstraction would also make it easier to support gamepad + VR gameplay, which is a requested feature as far as I know.

> vr.isMotionControllerActive(), vr.isLeftHandActive(), vr.isRightHandActive()

Should we have a more general interface for that? E. g. do we want to support full body tracking and such in the future?
Although "hard-coding" methods for hands makes sense anyway, they are likely to be required for any practical implementation. What is `isMotionControllerActive`? is it the same as `vr.isLeftHandActive() or vr.isRightHandActive()`?

> vr.isLeftHandedMode()

Does this have any significance other than for binding controls? E.g. if we have a way to rebind inputs arbitrarily between actions and buttons on the controllers, can this be replicated with an input configuration?

> vr.playerMarionette()

Should this be a part of the vr package? I understand that it's relevant here, but could it not be used for other things, e. g. procedural animation? Or is this an abstraction over some lower level package that can be exposed elsewhere?

> vr.recenter(bool recenterZ)

Should we also offer more granular control over this offset? E. g. for implementing custom recentering logic? (whether any other logic is a good practice is a separate question)

> vr.isSeatedMode()

Is this as simple as changing the player height, and changing the behaviour of `vr.recenter`? Can we "dehardcode" this from the get-go by giving Lua control over the offset from camera to player actor, and having that implement recenter and seated mode?

> difference between the actor and the camera

There are a few Lua mods now which actively use the Static camera mode. VR should probably still support those, so this difference would be between the VR camera and the camera position as set by the Lua API, rather than the player actor's eyes.
madsbuvi
Posts: 11
Joined: 28 Dec 2019, 15:53
Gitlab profile: https://gitlab.com/madsbuvi

Re: Lua API for VR

Post by madsbuvi »

> It's not clear what sub actions are aiming to achieve.
Sub-actions are aimed at actions that change behavior depending on what input source activated them. E.g. a pointing action can easily discern which hand should be pointing based on sub-actions. It is, naturally, an XR specific concept.

Currently openmw-vr Lua only has one use case for sub-actions: the pointeractive action. It could be bound to some input on the left hand, the right hand, or both. When the action is active, it could be active on the left hand only, the right hand only, or both. In order to know which hand to give the pointer animation and laser beam, I must know which hand is triggering the action. Sub-actions are the alternative to creating separate action IDs for left and right hand pointing.

Functionally, sub-actions are a complement to bindings. When an action is created in OpenXR, it is assigned zero or more sub-actions. Assigning sub-actions is totally optional, and input bindings do not need to match sub-actions to be valid. Sub-action information is only provided for those sub-actions that were assigned to an action.

As you might guess, the pointeractive action is assigned the LeftHand and RightHand subactions. The only other actions with subactions are an under-the-hood tracking action, as tracking data is framed as pose input actions in OpenXR, and the haptics output action. I do not currently have any plans to expose either of those to Lua, although that could be one alternate way of exposing tracker data.

> What if we implement the gameplay bindings one layer deeper? As in we bind "everything" in OpenXR into Lua, and expose that as engine handlers and methods in `openmw.input` (like mouse and keyboard), rather than actions, and implement bindings from that to actions in Lua? If there is a common interface for OpenXR bindings, we could even expose the list of those bindings as a config file, and enable users to accommodate unusual headsets and such in the future.
I have considered this. In many ways I would be OK with it, as it's easier for the end user to understand than dealing with OpenXR bindings. Currently, they are exposed in a bindings file, but this file deals with OpenXR paths directly and is therefore a really bad user experience. We would have to re-implement sub-actions ourselves, but that's easy.

The main motivation for not doing this before is that OpenXR bindings are framed very similarly to OpenVR bindings, in that they are merely "suggestions", and there is an expectation that VR runtimes should offer a framework for rebinding controls similar to what SteamVR offers. Sadly, I don't think a single VR runtime (not even SteamVR) has offered such a framework for OpenXR yet.

> Should we have a more general interface for that? E. g. do we want to support full body tracking and such in the future?
Although "hard-coding" methods for hands makes sense anyway, they are likely to be required for any practical implementation. What is `isMotionControllerActive`? is it the same as `vr.isLeftHandActive() or vr.isRightHandActive()`?
They are meant to check whether the left/right hand motion controllers are active, separately from tracking. isMotionControllerActive is a shorthand for vr.isLeftHandActive() || vr.isRightHandActive(). All three are useful, as they let you adjust behavior depending on what controllers are active. isMotionControllerActive is the most useful, as it disambiguates whether the game should be in keyboard+mouse/gamepad mode, or using motion controls. In C++ the method is the inverse, VR::getKBMouseModeActive, which may be better and more informative than isMotionControllerActive, so I could revert it to that.
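
In script terms, the intended relationship is simply the following (assuming the proposed functions):

Code: Select all

local vr = require('openmw.vr')

-- isMotionControllerActive() is intended as a shorthand for the two hand checks.
local motionControls = vr.isLeftHandActive() or vr.isRightHandActive()

if motionControls then
    -- drive interactions from the motion controllers
else
    -- fall back to keyboard+mouse / gamepad style controls
end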

> vr.isLeftHandedMode()
Since I implemented sub-actions, the primary use of this is pointing defaults. When BOTH hands are pointing, this variable informs me which hand should receive the pointing laser (as I only support one at a time). When in GUI mode, BOTH hands are ALWAYS pointing. If it becomes possible to change which hands the weapon and shield are attached to, this variable can also inform the need to flip the weapon and shield hands.

I ran out of time; I will respond more tomorrow!
madsbuvi
Posts: 11
Joined: 28 Dec 2019, 15:53
Gitlab profile: https://gitlab.com/madsbuvi

Re: Lua API for VR

Post by madsbuvi »

> vr.recenter(bool recenterZ)

> Should we also offer more granular control over this offset? E. g. for implementing custom recentering logic? (whether any other logic is a good practice is a separate question)

> vr.isSeatedMode()

> Is this as simple as changing the player height, and changing the behaviour of `vr.recenter`? Can we "dehardcode" this from the get-go by giving Lua control over the offset from camera to player actor, and having that implement recenter and seated mode?
I think you are right. All of vr.isSeatedMode(), vr.recenter(), and vr.isLeftHandedMode() should be soft-coded (i.e. not part of the openmw.vr Lua package, but defined by the VR scripts themselves).

As for offering more granular control of the offset: the "offset" here is the difference between the position of the player and the character in the player's VR stage. The only thing recenter can be is consumption of that difference; anything else would not be recentering.
Now that I think about it, I believe I am making a mistake lumping recentering and adjusting the eye level into the same call.

> vr.playerMarionette()

> Should this be a part of the vr package? I understand that it's relevant here, but could it not be used for other things, e. g. procedural animation? Or is this an abstraction over some lower level package that can be exposed elsewhere?
This call is meant to provide an instance of the Marionette type defined in the Actor Part manipulation API, containing all the trackers available to VR.

I think maybe I did a poor job of explaining my thoughts on the "Actor Part manipulation API", as I named it. This is by far the part I am the least sure about and would like feedback on, as my experience with animation in general is limited to the hacks I've written to make OpenMW-VR work.
One of my motivations for defining the "Attachment" and "Marionette" types is to avoid the need for scripts to have direct access to same-frame tracking data. The reason for this is that Lua scripts are the first update step of every frame, while tracking data is updated in the last step. VR trackers work by predicting where each tracker will be when the frame is rendered, which motivates updating tracking last. If the update step is long, the difference in when tracking is updated is a few milliseconds and may make a difference in prediction quality.

By defining Attachment types, vr.playerMarionette() can return a type that runs the actual update of the player animation when tracking data is updated.
> There are a few Lua mods now which actively use Static camera mode. VR should probably still support those, so this difference would be between the VR camera and the camera position as set by the Lua API, rather than the player actor's eyes.
Can you give an example of these mods? I'd like to see what exactly the use case is. It may be that mods that manipulate the camera in some way should have to explicitly support VR to be compatible, depending on what it is they're doing, since camera manipulations that work great on flat screens may be horrible in VR.
urm
Posts: 83
Joined: 02 Jun 2017, 16:05
Gitlab profile: https://gitlab.com/uramer

Re: Lua API for VR

Post by urm »

> Currently openmw-vr Lua only has one use-case for sub-actions, the pointeractive action

What pointer actions are there? Is it just the activate, as in the actual visible pointer? Or anything else?

> Sadly, i don't think a single VR runtime (not even steamvr) have offered such a framework for OpenXR yet.

There is a bigger issue with using them directly. I've run into it when thinking about potential Steam Input API support. It's that we don't actually know how many bindings we need until runtime, since mods can add new bindings (aka input actions). Normally that's not an issue for games, so any such re-binding interfaces expect the list of bindings provided ahead of time, which is not possible in our case.

> isMotionControllerActive is a shorthand for vr.isLeftHandActive() || vr.isRightHandActive().

That makes sense, except the `isMotionControllerActive` name is very confusing to me. I feel like the name should communicate that it's specifically "hand" controllers, or the left/right hand functions should mention `MotionController`.

> left hand mode

Would it not be easier to just have two different bindings, left hand and right hand activate, and enable the right laser that way? It would also allow binding the left hand laser to a right hand button, and vice versa, in case somebody prefers that.

> Anything else would not be recentering.

I've used that terminology because you've presented two ways to "recenter". Whatever we call it, there might be other meaningful ways to normalize the VR camera offset, e. g. do it automatically somehow. It's common for VR games to do something like that when the VR camera collides with a wall.

> marionette

I'm also fairly clueless about animations, and I'm not sure I understand exactly what you mean, but I doubt I would have anything useful to say here in any case.

> Can you give an example of these mods?

Primarily it's our darling OpenNevermind: https://youtu.be/XpH05T7WtfE . I'm also working on a mod with similar camera functionality, but somewhat more ambitious (and with more free camera controls). Finally, there are a few mods that enable a free cinematic camera of some sort, and my Dramatic Entry: https://youtu.be/b0_n1Bm8VCk (or other cutscene-like mods).
madsbuvi
Posts: 11
Joined: 28 Dec 2019, 15:53
Gitlab profile: https://gitlab.com/madsbuvi

Re: Lua API for VR

Post by madsbuvi »

> What pointer actions are there? Is it just the activate, as in the actual visible pointer? Or anything else?
It's just the visible pointer, yes.

At this point I've more or less decided to make an input rebinding wrapper. If I do this, I could add subaction information to every binding, for modders to use any way they wish. An example use case for this could be to change which hand spells are cast from depending on which motion controller did the casting, since in this case the "Use" action would come with subaction information. Without subaction information, you would have to make two new actions, one for left and one for right, and expect the player to bind them correctly for every such use case (if there are any more).
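
To illustrate, with subaction information attached to the existing Use action, such a script could look roughly like this (the ACTION/SUBACTION names are the proposed ones; treating the result as the casting hand is the hypothetical part):

Code: Select all

local input = require('openmw.input')

-- Decide which hand a spell should be cast from, based on which motion
-- controller triggered the Use action.
local function castingHand()
    for _, sub in ipairs(input.actionSubactions(input.ACTION.Use)) do
        if sub == input.SUBACTION.LeftHand then return 'left' end
        if sub == input.SUBACTION.RightHand then return 'right' end
    end
    return nil -- Use was triggered by a non-hand source, e.g. a gamepad
end
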
> There is a bigger issue with using them directly. I've ran into it when thinking about potential Steam Input API support. It's that we don't actually know how many bindings we need until runtime, since mods can add new bindings (aka input actions). Normally that's not an issue for games, so any such re-binding interfaces expect the list of bindings provided ahead of time, which is not possible in our case.
I'm not sure I understand; that sounds like another reason to create our own bindings wrapper. I'm guessing that's what you're saying?
> That makes sense, except the `isMotionControllerActive` name is very confusing to me. I feel like the name should communicate that it's specifically "hand" controllers, or the left/right hand functions should mention `MotionController`.
I expect most people in the VR space will understand the MotionController terminology. You're right, though, that vr.isLeftHandActive() isn't as clear as I thought; it's not so obvious that it means the motion controller is on. vr.isLeftHandControllerEnabled() sounds more right. vr.isLeftHandMotionControllerEnabled() would be even clearer, but I think it is more verbose than necessary.
> I've used that terminology because you've presented two ways to "recenter".
I assume you are referring to the recenterZ parameter? The purpose of this parameter is for seated play, and it controls whether the eye level should be adjusted or not. As I mentioned in my previous post, lumping this in with the recenter() call is probably a mistake. recenter() should just recenter XY, and adjusting the eye level should be a separate call.

Since I currently make no attempt at doing collision for the camera, recenter() is only automatically called on startup, game load, and instant transitions (entrances, recall, interventions, and similar). Instant transitions IMO should not automatically adjust eye level, as it would cause the eye level to change arbitrarily while the player is playing, which might be noticeable and annoying. Therefore instant transitions currently call recenter with recenterZ=false, while all other calls use recenterZ=true. I think it would be less confusing if adjusting the eye level was instead a separate call, something like resetEyeLevel(). This would also allow the use case of resetting eye level separately, e.g. during/following the race select dialogue.
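
Under that split, the call sites would become something like the following sketch, assuming recenter() drops the recenterZ parameter and resetEyeLevel() is, as said, only a proposed name:

Code: Select all

local vr = require('openmw.vr')

-- Instant transitions (doors, recall, interventions): only re-align in XY.
vr.recenter()

-- Startup, game load, or e.g. after the race selection dialogue:
vr.recenter()
vr.resetEyeLevel() -- proposed separate call; only meaningful in seated mode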

> Whatever we call it, there might be other meaningful ways to normalize the VR camera offset, e. g. do it automatically somehow.
I know Skyrim just automatically recenters you if your head gets too far from your character, e.g. by walking into a wall. That just means the VR code is making a call to recenter whenever that condition occurs. I'm not sure what the confusion about terminology is here?

> It's common for VR games to do something like that when the VR camera collides with a wall.
Isn't that just collision? Not letting the camera move if it would be moving through the wall. Using the term "recenter" for this wouldn't make any sense since that's not at all what would be happening.
> Would it not be easier to just have two different bindings, left hand and right hand activate, and enable the right laser that way.
You quoted vr.isLeftHandedMode(), but I think you're rather arguing against the necessity of subactions? GUI mode still needs leftHandedMode to inform it which hand should be pointing. I'm not going to force the user to hold down the pointer action while navigating menus, or make it a lightshow of dual wielding lasers. leftHandedMode() also has the future use case of switching which hands hold the weapon and shield, and which draws the bow string (if I ever implement realistic archery).
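
Concretely, the GUI-mode default could be resolved with something like the following, whether isLeftHandedMode ends up hard-coded in openmw.vr or soft-coded in the scripts:

Code: Select all

local vr = require('openmw.vr')

-- Pick which hand gets the laser when both hands are pointing,
-- which is always the case in GUI mode.
local function pointerHand(leftPointing, rightPointing)
    if leftPointing and rightPointing then
        return vr.isLeftHandedMode() and 'left' or 'right'
    elseif leftPointing then
        return 'left'
    else
        return 'right'
    end
end
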
> It would also allow bindings the left hand laser to right hand button, and vice versa, in case somebody prefers that.
Enabling the pointer on one hand and having it appear in another seems like an awkward experience. I wouldn't want to actively prevent it if someone really wants that, but it's specific enough that I think they could mod it themselves by flipping the pointing hands in the script.
> I'm also fairly clueless about animations, and I'm not sure I understand exactly what you mean
I'm sure I'm doing a very poor job explaining it too. The Marionette and Attachment types that I'm talking about here do not currently exist; they are suggestions for an API for animation overriding.

"Marionette" is just what i'm naming a type that holds a collection of "Attachment" objects, and collectively binds/unbinds them to an actor. This is so that VR scripts don't have to manage a varying number of Attachment objects based on what tracking is available or not.

The "Attachment" type is an object that attaches to an actor and overrides a specific bone. The StaticAttachment type is an instance of Attachment available to Lua that exists to do specific, pre-scripted changes to animations, such as forcing a finger to point forward.

vr.playerMarionette() offers an instance of Marionette that contains all Attachments generated by tracking data from the player. These are instances of Attachment that are not exposed to Lua, since they need to update their tracking data under the hood at the right time and then apply it to the bones, which can't be done via Lua. The alternative would be something like vr.getPlayerTrackers(), which returns a list of Attachments directly instead of wrapping them inside the possibly confusing Marionette type.
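
To contrast the two shapes in script terms (both are proposals; the `part` field in the second variant is an assumption about what such a tracker object would expose):

Code: Select all

local vr = require('openmw.vr')
local self = require('openmw.self') -- player script

-- Option 1: one Marionette wraps all tracker attachments.
vr.playerMarionette():attachTo(self)

-- Option 2 (alternative): the script manages each tracker attachment itself.
-- for _, tracker in ipairs(vr.getPlayerTrackers()) do
--     tracker:attachTo(self, tracker.part) -- `part` is an assumed field
-- end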