# Improving the p5.xr Library Through Artistic Examples

#### by [Anais Gonzalez](https://anaisgonzalez.design)

## Overview
Over the course of this summer, I worked on improving the p5.xr library by creating a series of artistic examples with the guidance of my mentor, Stalgia Grigg. The p5.xr library is an add-on for p5 that makes it possible to run p5 sketches in Augmented Reality or Virtual Reality. It does this with the help of WebXR, and anyone who is familiar with p5 can experiment with the library as long as they have the necessary equipment.

The major goals of this project were to explore the possibilities of creative coding in p5.xr and show others how they can use p5 to work with the core concepts of immersive mediums. To accomplish this, the different themes of the project were broken down into a collection of simple and complex examples. The simple examples focus on the technical aspects of using VR-specific functions within p5.xr, while the complex examples are more abstract, creative explorations of those same concepts.

## Work

### [Example #1: Immersive Typography](https://github.com/stalgiag/p5.xr/tree/master/examples/immersive-typography)
This was the first theme I started working on, since I like experimenting with type. I started thinking about ways I could immerse myself in letterforms and tried to imagine what that would look like through some sketches.

 | ||
 | ||
|
||
I thought of including 3D shapes with moving typographic textures on them and floating letterforms scattered throughout space. When I started working on it in VR, I learned how to use [intersectsBox()](https://p5xr.org/#/reference/raycasting?id=intersectsbox), a VR-specific function that uses raycasting to trigger changes when the viewer's gaze hits a box of a given size. This function ended up being the basis for the basic example, where I use it to change a box's stroke color and text just by looking at it.

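Below is a minimal sketch of that gaze-interaction pattern. It assumes the p5.xr calls referenced in this report and its linked docs (createVRCanvas(), setVRBackgroundColor(), and intersectsBox()); the exact signatures, and whether createVRCanvas() belongs in preload(), may differ between p5.xr versions, so treat it as a sketch rather than a drop-in example. The original also swaps the box's text, which needs a font loaded with loadFont() in WEBGL and is omitted here for brevity.

```js
// Gaze-driven box: a hedged reconstruction of the basic example's pattern.
// intersectsBox(size, x, y, z) is assumed to cast a ray from the viewer's
// gaze and return true when it hits a box of that size at that position.
function preload() {
  createVRCanvas(); // start the immersive session (p5.xr convention)
}

function setup() {
  setVRBackgroundColor(0, 0, 60);
}

function draw() {
  const x = 0, y = 0, z = -5;             // 5 units in front of the viewer
  const lookedAt = intersectsBox(2, x, y, z);

  push();
  translate(x, y, z);
  noFill();
  stroke(lookedAt ? 'magenta' : 'white'); // stroke changes while gazed at
  box(2);
  pop();
}
```
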
 | ||
|
||
For the complex example, I started working on typographic textures in WEBGL first before bringing them into VR. These are some different versions of those tests: [1](https://editor.p5js.org/agonzal019/sketches/ZTjSOBQ7L), [2](https://editor.p5js.org/agonzal019/sketches/aFxmSlZ2w), [3](https://editor.p5js.org/agonzal019/sketches/PTaRhklbv). One of the first things I struggled with was not knowing how to use timing properly to reset an array. After talking with Stalgia about it, they taught me about modulo, which returns the remainder after division; that one technique would play a big role in many of the other examples I created. This is also where I encountered my first issue: I found out that using plain text in VR was difficult because the text would only be visible at certain angles, so I decided to keep using createGraphics() to display the type instead. This process was going well until I tried to use deltaTime in one of the earlier versions of this example. The WEBGL version's timing changes functioned perfectly in the browser, but when I brought it into VR, the letters in the array wouldn't switch. Luckily, after posting an [issue](https://github.com/stalgiag/p5.xr/issues/133) about it, the problem was resolved and deltaTime and millis() were functioning again.

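Here is a small, plain-p5 sketch of that modulo timing pattern: cycling through an array of letters on a fixed interval and drawing the current letter onto a createGraphics() buffer used as a texture. The letters, interval, and buffer size are placeholder values, not the ones from the actual example.

```js
// Cycling through an array on a timer with modulo (plain p5, WEBGL mode).
// Every `interval` ms the index advances and wraps back to 0 automatically.
const letters = ['a', 'n', 'a', 'i', 's'];
const interval = 500; // ms per letter
let pg;

function setup() {
  createCanvas(600, 600, WEBGL);
  pg = createGraphics(256, 256); // offscreen buffer used as a texture
  pg.textAlign(CENTER, CENTER);
  pg.textSize(180);
}

function draw() {
  background(0);
  // millis() / interval counts elapsed intervals; modulo wraps that count
  // into a valid array index, so the sequence repeats forever.
  const i = floor(millis() / interval) % letters.length;
  pg.background(255, 0, 150);
  pg.text(letters[i], pg.width / 2, pg.height / 2);
  texture(pg);
  rotateY(millis() * 0.001);
  box(200);
}
```
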
After that was resolved, I finished the complex example by combining different parts of my earlier drafts into one piece: I used intersectsBox() to increase the scale of the box when the viewer looks at it, scattered an array of changing letterforms through space using deltaTime and textToPoints(), and added a planetary structure with rotating text to make my own galaxy of typography.

 | ||
|
||
PRs in this section: [#137](https://github.com/stalgiag/p5.xr/pull/137) , [#138](https://github.com/stalgiag/p5.xr/pull/138) | ||
|
||
|
||
|
||
### [Example #2: Visual Art Making Tools](https://github.com/stalgiag/p5.xr/tree/master/examples/visual-art-making-tools)

![initial sketches for the visual art making tools](images/drawing-tool-sketches.jpg)

I wanted to be able to draw a few solid colors in the basic example and then draw different textures for the complex example.

I started experimenting with using 3D shapes as drawing tools in WEBGL by removing background() from draw(), but quickly ran into problems when trying the same method in VR. I learned that if background() is put into draw(), one of the headset's eyes becomes completely blocked out. This is because draw() runs twice in VR (once per eye), which is why [setVRBackgroundColor()](https://p5xr.org/#/reference/vr?id=setvrbackgroundcolor) goes in setup() instead, so that the background is cleared correctly for each eye's render.

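As a minimal illustration of that setup (again assuming the p5.xr conventions referenced above, with createVRCanvas() called in preload()):

```js
// Clear color is declared once in setup(); calling background() inside
// draw() would wipe the first eye's render while the second eye draws.
function preload() {
  createVRCanvas(); // request the immersive session
}

function setup() {
  // p5.xr uses this color to clear the buffer for each eye's render pass.
  setVRBackgroundColor(200, 0, 150);
}

function draw() {
  // draw() still runs once per eye -- just don't clear the frame here.
  translate(0, 0, -3);
  fill(255);
  sphere(0.5);
}
```
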
Since I couldn't rely on skipping the background to accumulate marks, Stalgia showed me a different method: keep an array of objects and draw each one at the x, y, and z position the viewer's controller had when it was recorded. Once the positioning was correct, we used [generateRay()](https://p5xr.org/#/reference/raycasting?id=generateray) to create a ray originating at the hand's location in order to activate intersectsSphere(). It's also necessary to use **applyMatrix(hand.pose)** so that the tilt of the hand is tracked as well.

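A rough sketch of that stroke-array pattern is below. The hand accessor is the part I'm least sure of: I've written it as getXRInput(RIGHT) returning an object with a `.position` vector and a `.pose` matrix, which is an assumption based on the p5xrInput.js file linked later in this report, so check that file for the exact call. The rest (pushing a point per frame and redrawing the whole array every frame) is the pattern described above.

```js
// Stroke-array drawing: record a point per frame, redraw them all every
// frame so the marks persist even though the frame is cleared per eye.
// ASSUMPTION: getXRInput(RIGHT) returns a controller object with a
// .position vector (and a .pose matrix) -- verify against p5xrInput.js.
let strokePoints = [];

function preload() {
  createVRCanvas();
}

function setup() {
  setVRBackgroundColor(0, 0, 0);
}

function draw() {
  const hand = getXRInput(RIGHT);
  if (hand) {
    // Record where the controller is this frame.
    strokePoints.push(createVector(hand.position.x, hand.position.y, hand.position.z));
  }
  // Redraw every recorded point; applyMatrix(hand.pose) could be added
  // before drawing the newest mark so it follows the hand's tilt.
  noStroke();
  fill(255, 0, 150);
  for (const p of strokePoints) {
    push();
    translate(p.x, p.y, p.z);
    sphere(0.02); // VR units are meters, so keep brush marks small
    pop();
  }
}
```
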
 | ||
|
||
After I was able to actually draw something, I started thinking about ways to add more variety. For the basic example, I used [intersectsSphere()](https://p5xr.org/#/reference/raycasting?id=intersectssphere) to change the color of the brushstroke. This method of using ray intersection to change things became tedious in the complex example. I'd been using it to change the color, size, and shape of the brush until I discovered that I could [utilize other buttons on my controller](https://github.com/stalgiag/p5.xr/blob/master/src/p5xr/core/p5xrInput.js) besides the trigger, so I started using those instead. *One thing to note for the Oculus Quest 2: the input code for the touchpad buttons does not work at all.*

 | ||
|
||
For the textures in the complex example, I initially wanted to use a collection of custom textures made in p5 as the brushstroke textures, but that caused the sketch to run incredibly slowly, so I improvised. I took screenshots of my textures, manipulated the images in Photoshop, and then used those images as the final textures for the sketch. I then made everything more fun and chaotic by randomizing the texture, shape, and size of the brush automatically as someone draws.

PRs in this section: [#140](https://github.com/stalgiag/p5.xr/pull/140), [#141](https://github.com/stalgiag/p5.xr/pull/141)

### [Example #3: Immersive 360](https://github.com/stalgiag/p5.xr/tree/master/examples/immersive-360)

![basic immersive 360 example](images/360-basic.gif)

I created p5 animations in the browser and then displayed them within VR by using a specific function called [surroundTexture()](https://p5xr.org/#/reference/vr?id=surroundtexture). Normally intended for displaying 360 photos, this function creates a very large sphere with inverted scale that surrounds the viewer. Regarding functionality, both the basic and complex examples allow the viewer to switch between states by pressing the trigger button. For the complex example, I also included some typographic animations to stay consistent with my style.

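A compact sketch of that setup, assuming surroundTexture() accepts a p5.Graphics buffer as its texture (signature assumed from the reference page linked above; the animation itself is a placeholder):

```js
// 360 backdrop driven by a live p5.Graphics animation.
let pg;

function preload() {
  createVRCanvas();
}

function setup() {
  setVRBackgroundColor(0, 0, 0);
  pg = createGraphics(1024, 512); // this buffer becomes the surrounding texture
}

function draw() {
  // Animate the buffer with ordinary 2D p5 calls...
  pg.background(10, 10, 40);
  pg.noStroke();
  pg.fill(255, 0, 150);
  for (let i = 0; i < 20; i++) {
    const x = (i * 60 + frameCount) % pg.width;
    const y = pg.height / 2 + sin(frameCount * 0.05 + i) * 100;
    pg.circle(x, y, 20);
  }
  // ...then wrap it onto the huge inverted sphere around the viewer.
  surroundTexture(pg);
}
```
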
 | ||
|
||
PRs in this section: [#145](https://github.com/stalgiag/p5.xr/pull/145), [#146](https://github.com/stalgiag/p5.xr/pull/146)

### [Example #4: Physics](https://github.com/stalgiag/p5.xr/tree/master/examples/physics)
I had never worked on a physics example before, so I tried watching some Coding Train tutorials on strings, but couldn't get that to function correctly in VR. After speaking with my mentor about it, they showed me a working physics example that I was able to expand upon for the complex version of this theme. The basic example includes boundaries and a ball that can be held and thrown around.

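For context, here is a stripped-down, plain-WEBGL version of the gravity-and-bounce idea behind the basic example. This is my own minimal reconstruction, not the code from the example; the real one adds room boundaries and controller-based holding and throwing.

```js
// A minimal gravity-and-bounce Ball (plain p5 WEBGL reconstruction).
class Ball {
  constructor(x, y, z) {
    this.pos = createVector(x, y, z);
    this.vel = createVector(0, 0, 0);
    this.r = 20;
  }
  update(floorY) {
    this.vel.y += 0.4;                  // gravity (y points down in WEBGL)
    this.pos.add(this.vel);
    if (this.pos.y + this.r > floorY) { // floor boundary
      this.pos.y = floorY - this.r;
      this.vel.y *= -0.8;               // bounce with some damping
    }
  }
  show() {
    push();
    translate(this.pos.x, this.pos.y, this.pos.z);
    sphere(this.r);
    pop();
  }
}

let ball;

function setup() {
  createCanvas(600, 600, WEBGL);
  ball = new Ball(0, -200, 0);
}

function draw() {
  background(30);
  ball.update(200);
  ball.show();
}
```
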
 | ||
|
||
For the complex example, I made the Ball class from earlier generate multiple balls at random x, y, and z locations that could change size, shape, texture, and color the moment they collided with a boundary. I tried to include type textures on the shapes as well, but they didn't display correctly for some reason, so I displayed the type textures on the boundaries of the room instead. I eventually removed the ability for the balls to change shape or texture since it felt too busy, and left them changing only size and color upon collision. Once I added in the other walls and the ceiling, the whole thing really came together.

 | ||
|
||
PRs in this section: [#143](https://github.com/stalgiag/p5.xr/pull/143), [#144](https://github.com/stalgiag/p5.xr/pull/144)

### [Example #5: Embodiment](https://github.com/stalgiag/p5.xr/pull/147)
For the embodiment example, my mentor explained some other XR functions that helped me position things in VR. We can get the location of the camera with viewerPosition, and we can get the pose of the camera with viewerPoseMatrix. Using **applyMatrix(viewerPoseMatrix)** on the head of the body allows it to mirror the direction and pose of the viewer's head. By putting viewerPosition inside of translate(), the other parts of the body become relative to the location of the head.

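A minimal sketch of that idea, using the viewerPosition and viewerPoseMatrix values described above. The body-part shapes, sizes, and offsets are placeholders, and the axis directions may need flipping depending on the scene.

```js
// Hedged sketch of the embodiment setup: the head follows the viewer's
// pose, and the rest of the body is positioned relative to the viewer.
function preload() {
  createVRCanvas();
}

function setup() {
  setVRBackgroundColor(20, 20, 60);
}

function draw() {
  // Everything after this translate is relative to the viewer's location.
  translate(viewerPosition.x, viewerPosition.y, viewerPosition.z);

  push();
  applyMatrix(viewerPoseMatrix); // the head mirrors the viewer's head pose
  box(0.25);                     // stand-in head
  pop();

  push();
  translate(0, 0.4, 0);          // offset the torso from the head
  fill(255, 0, 150);
  cylinder(0.1, 0.5);            // stand-in torso
  pop();
}
```
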
 | ||
|
||
I didn't get to finish the complex example because I ran out of time. I wanted to create a dragon that the viewer could look at and move with, but I was having trouble converting the size of the dragon's body in WEBGL to the correct dimensions in VR, which are extremely small whenever hand positioning is included. So far, I've only managed to get the head and wings working.

PRs in this section: [#147](https://github.com/stalgiag/p5.xr/pull/147)

## Work Pull Requests and Issues

* All of the pull requests made as a part of the project can be found here: https://github.com/stalgiag/p5.xr/pulls?q=is%3Apr+author%3Aanagondesign+created%3A%3C2021-08-23
* All of the issues opened as part of the project can be found here: https://github.com/stalgiag/p5.xr/issues?q=is%3Aissue+author%3Aanagondesign+created%3A%3C2021-08-23+

## Future
* Could include specific input button controls for the Oculus Quest 2's X, Y, A, and B buttons, because right now the gamepad lines of code don't function for the Oculus even though they work fine for the Vive.
* Could figure out why including text in VR sometimes makes the sketch run much slower than it would in WEBGL mode.

## Conclusion
Even though it was challenging for me as a novice coder working with an experimental program, I had fun making these examples and I learned so many new things! First and foremost, I'd like to thank my mentor Stalgia Grigg for all the patience, kindness, and encouragement they've given me this past summer. They've been such a great mentor, and I don't think I would have gotten this far in the program without them and their guidance. I would also like to thank the Processing Foundation and Google for giving me this opportunity to contribute something cool to their community <3