Tuesday, January 17, 2017

Team PPJ 1/10/17 - 1/17/17

This week a lot was done on the back end, so this blog post is going to be very light on images. Much of the programming side went toward performance improvements and converting Blueprints to C++. One of the coders also created a system for building new tools in preparation for our impending addition of an off-hand player tool. In addition, a combination of coders and DIGM team members worked on the main menu, and its basic functionality and visuals are in game and look pretty good. Finally, an AI model has been added to the game and is awaiting rigging and texturing. Even so, it's a big step up from the lovely test cylinder that graced our game before.

Going forward, we are planning to improve a lot of visuals for GDC, as well as add a new mechanic to our game that makes use of the new off-hand tool. Hopefully we will be able to figure out a better workflow for capturing mixed reality, but only if we have extra time in between working on the game itself.

  • Lots of visual progress
  • Main menu scene very close to being implemented in game
  • Mixed reality capturing is finicky
  • Team meetings are scattered due to new schedules

Daniel Ingman PPJ 1/10/17 - 1/17/17

This week I spent time finishing up a greybox AI model and preparing for the GDC competition. It was lighter on the visual design side, since a great deal of the work I did was logistical and housekeeping, such as preparing a presentation for GDC and working with teammates to record footage for an updated trailer.

When it came to shooting footage, Cory's blog post covers it pretty well, but it turns out that shooting mixed reality footage is not nearly as easy as it was suggested to us: it requires a lot of logistical work, as well as making sure our game can support mixed reality shooting in the first place. Unreal doesn't support mixed reality beyond the promise of it in a future update, so we spent far too much time trying to fudge it. We will attempt mixed reality again at some point in the future, but not for the GDC competition.

  • AI model finished
  • Preparation for GDC
  • Mixed reality didn't work
  • Lots of time spent trying to shoot footage with little progress
Total time spent: 12 hours
  • AI Model: 4 hours
  • Shooting footage: 6 hours
  • GDC preparation: 3 hours

Monday, January 16, 2017

Cory Zicolella PPJ 1/10 - 1/16

This week was a lot of troubleshooting!  While I went to the meetings, which took considerable lengths of time, the bulk of the week was spent trying to get the video for the Intel competition up to snuff.

This, however, took much problem solving and deliberation.  We really wanted to obtain mixed VR footage, but didn't quite know how to do it.  This resulted in us looking it up -- but come recording time, we were slightly pressed for time.  Which leads to the next time-consumer: setting up mixed VR without our game actually supporting it (it didn't have a second in-game camera that could be coded in time).  This required a standalone camera filming the footage, with the VR headset physically moved to sit beside it, off of the play area and person, vertically alongside it.

The reason it needed to be vertical is that in our research we discovered the only mixed reality builds out there either used an editor trick to capture widescreen footage, or recorded one eye and physically zoomed in the screen while recording at a high resolution, or simply left the recording as normal and displayed logos in the otherwise blank space.

The first option wasn't viable because it requires a many-hour install of a different version of the editor, and we were in a different room than usual.  The middle option simply wasn't an option for our game, as it is a faster-paced sports game.  So we decided to get widescreen footage via a vertical Vive, then rotate and enlarge the footage in post to fill the screen.

That alone took about 5 hours.

The rest of the time was spent in the lab, attempting to get normal first-person footage for the in-between shots, to otherwise demonstrate the game.  Unfortunately, there were version issues -- certain changes hadn't been pushed and were only available in the build in the mocap room -- so we spent about 90 minutes troubleshooting with one of our coders.  By the time it got sorted, we only had about 20 minutes to get footage before it was late at night, at which point I retired for the evening.

So I have all the mixed VR footage and some first-person footage, but none of it is edited.  We could also probably use more first-person footage.  That being said, I have another day -- Tuesday -- to get it situated before the presentation.  Here's to progress and cool VR videos (hopefully)!

Mike Cancelosi Blog Post Winter Week 1

This week I helped push the game forward to be ready for the GDC presentation.  On the coding side, I helped improve the AI and the menu. I also added the ability to change the color of the gravity well spiral and created a material for the menu text.

Hours Spent : 13 hours

Pros : The menu scene is pretty far along.
Cons : The AI keeps breaking when we push/pull

Johanna Oberto PPJ 1/10/17 - 1/17/17

It was a bit harder for me to get back in the swing of things this week, after a month of being away from the game. I need to refamiliarize myself with the code and where everything is now, as many edits have been made over the past month that I am not yet acquainted with. This week I attended meetings to plan the coming weeks' work, and helped to work on the GDC presentation.
  • Got my head back in the game
  • Have a focus on priorities
  • Must refamiliarize with game and code

Andrew DiNunzio PPJ 1/10/17 - 1/17/17

This week, I started working with Mike C and Mike D on making the ball not have infinite range at any angle. We gave the game balls an invisible larger sphere around them, so it's easier for players to grab them without having to hit a small ball moving at high velocity. In doing this, the whip can now work when there are multiple balls in the scene.
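The grab-sphere idea above can be sketched in plain C++ (names and radii here are illustrative, not the project's actual code): the ball keeps a grab radius larger than its visible mesh, and grab checks test against that larger invisible sphere.

```cpp
#include <cassert>
#include <cmath>

struct Vec3 { double x, y, z; };

double Dist(const Vec3& a, const Vec3& b) {
    double dx = a.x - b.x, dy = a.y - b.y, dz = a.z - b.z;
    return std::sqrt(dx * dx + dy * dy + dz * dz);
}

struct GameBall {
    Vec3 center;
    double visualRadius;  // radius of the rendered mesh
    double grabRadius;    // larger, invisible grab sphere

    // A controller can grab the ball anywhere inside the grab
    // sphere, not just on the visible surface -- more forgiving
    // when the ball moves at high velocity.
    bool CanGrab(const Vec3& controllerPos) const {
        return Dist(center, controllerPos) <= grabRadius;
    }
};
```

In Unreal terms this would simply be an extra invisible collision component parented to the ball, but the distance check captures the gameplay effect.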

Then, I spent some time finishing up the decoupling of the whip from the player's motion controller. All tools have a common interface for "engaging" and "disengaging", as well as a "trigger axis" which the Vive's motion controller uses for how much its trigger is pushed in. Decoupling this logic also made it possible for the AI character to use the exact same code for the tools as the human player does.
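A minimal sketch of the kind of common tool interface described above (names are hypothetical, not the project's actual API): every tool exposes engage/disengage plus a trigger axis, so the human player's Vive controller and the AI can drive the same code.

```cpp
#include <algorithm>
#include <cassert>

// Common interface every tool implements; both player input and
// AI logic talk to tools only through these three calls.
class ITool {
public:
    virtual ~ITool() = default;
    virtual void Engage() = 0;
    virtual void Disengage() = 0;
    // value in [0, 1]: how far the trigger (real or AI-simulated)
    // is pressed.
    virtual void SetTriggerAxis(double value) = 0;
};

// Illustrative concrete tool.
class Whip : public ITool {
public:
    void Engage() override { engaged_ = true; }
    void Disengage() override { engaged_ = false; trigger_ = 0.0; }
    void SetTriggerAxis(double value) override {
        trigger_ = std::clamp(value, 0.0, 1.0);
    }
    bool IsEngaged() const { return engaged_; }
    double Trigger() const { return trigger_; }
private:
    bool engaged_ = false;
    double trigger_ = 0.0;
};
```

Because callers only see `ITool`, swapping the whip for any future off-hand tool requires no changes on the input side.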

In doing this, I also put the pieces in place to make it easier to create additional tools. As a proof of concept, I created a "psychic wand" tool in one night, and I documented the whole process here (Google Docs). I did this to make it easier for other developers and designers to understand what steps are involved in making new tools.

Basically, it is a tool that holds a charge and has a cool-off period when the charge is up (while it recharges to its full charge). When the player holds the trigger (assuming they don't miss), it "attaches" to the ball in front of it and when the player moves his or her hand, a force is applied to the attached ball depending not only on how much the trigger is held down, but also how quickly they moved their arm. The charge is depleted only based on how much the trigger is held down (and does so regardless of whether or not they successfully grabbed a ball).
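The charge/cool-off behavior described above can be sketched like this (rates and names are illustrative assumptions, not the project's actual values): charge drains in proportion to trigger pressure whether or not a ball was grabbed, and once empty the wand is unusable until it recharges completely.

```cpp
#include <cassert>

class PsychicWandCharge {
public:
    // Advance one frame; returns true if the wand may apply force
    // this tick.
    bool Tick(double triggerAxis, double dt) {
        if (coolingOff_) {
            charge_ += kRechargeRate * dt;
            if (charge_ >= kMaxCharge) {
                charge_ = kMaxCharge;
                coolingOff_ = false;
            }
            return false;  // unusable until fully recharged
        }
        // Depletion depends only on how far the trigger is held,
        // not on whether a ball was actually grabbed.
        charge_ -= kDrainRate * triggerAxis * dt;
        if (charge_ <= 0.0) {
            charge_ = 0.0;
            coolingOff_ = true;  // cool-off period begins
            return false;
        }
        return triggerAxis > 0.0;
    }
    double Charge() const { return charge_; }
    bool CoolingOff() const { return coolingOff_; }
private:
    static constexpr double kMaxCharge = 100.0;
    static constexpr double kDrainRate = 50.0;     // charge/sec at full trigger
    static constexpr double kRechargeRate = 25.0;  // charge/sec while cooling off
    double charge_ = kMaxCharge;
    bool coolingOff_ = false;
};
```

The force actually applied to a grabbed ball would then scale with both the trigger axis and the hand's velocity, as described above; this sketch only covers the charge bookkeeping.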

I posted a video of it here (Google Drive).

I didn't work on fixing the whip's damping problem, since I'm still not totally sure how to go about fixing it, and I felt my time was better spent on the above.

In terms of my reading, I finished chapters 5 and 6 in C++ primer, and I'm partially through chapter 7. I didn't do any of the exercises yet, but I figured I'd go back and do all of them after Part I (at the end of ch 7).

Time spent: Total: ~25 hours

10 hours - Decoupling the whip from the player's controller completely
10 hours - Creating proof-of-concept "psychic wand" tool and documenting the process
5 hours - Reading C++ Primer

  • Whip is now its own entity completely and can be used by the AI
  • Common interface for input to all tools
  • Tools are now much easier to make, and the entire process is documented
  • I learned quite a bit about C++

  • Whip damping problem is still not fixed

Tyler Schacht PPJ 1/10/17 - 1/17/17

If you are dying to know, I got my computer back at the beginning of the week, which I am super ecstatic about.  With my computer back in commission, I was able to get back to work on the website.  Some of the things I implemented: fluid movement when using the navigation bar, a video "cover page" to show off the game in motion, and a fun little easter egg when you hover over (most) portraits.  In addition to the main website, the Blogger got a bit of a touch-up to match the general feel of the main site.

I also helped capture video footage with Cory, Dan, and Mike DiLucca.

Screenshot of current website

Screenshot of the new Blogger
Total Hours: 18 hours
- Main website: 8 hours
- Blogger Update: 1 hour
- GDC game footage: 5 hours
- Meetings: 4 hours

Positives: I was able to pump out this website in less than a week, and I think it looks pretty nice.
Negatives: We are still working on the game, so sections of the website are in a "placeholder" state, such as the video background and the "About" section.

Sunday, January 15, 2017

Ryan Badurina - Jan. 10 - Jan. 17, 2017 PPJ

This week was stressful on so many levels.  The whole team was working in overdrive to create and finalize enough assets, trailers, presentations, etc. before the Intel presentation this coming week.  I was tasked with creating a new main menu / elevator wall that acts as the "accept" button for level, AI, and overall player selection.  The design is that the player throws balls that act as "physical" settings, selecting various aspects of the game by throwing those balls into a tube.

Front version of the goal without textures.  Uses separate primitives to get a basic yet complex shape.

Close-up of the LED lights and the "strip" they are connected to.

Length of the tube.  I wanted to bring it back far enough that it looks like it has depth and the ball disappears into it.

Once the modeling and UVing were done, I went into Photoshop and made textures to fit the model.  However, rather than apply them in Maya, I applied the textures and material edits in Unreal itself to get better visualization and render quality.

Material layout for the LED lights.  It uses a Constant Vector going into a Multiply node to boost emissive brightness.

After some work, I got good collisions working and tested them out with a basic Third Person Character outside of the project.

Current version of the main menu wall.

It took a bit of time and work, but the wall looks pretty good and presentable for our showcase this week.  I talked with my teammates about what else we should do with the wall, and we discussed that we would go back to this later in the term to re-work it for proper animation and such.

Other things I aided with included our presentation for the Intel competition this Wednesday and organizing our Google Drive to make it cleaner.  Small stuff, but stuff that helps our overall productivity and readiness for Intel.

  - Weekday Meeting: 0.5 hours
  - Weekend Development Session: 6.5 hours
  - Presentation Development: 1 hour
  - Main Menu Wall Design: 2 hours
  - Main Menu Wall Model: 4 hours
  - Main Menu Wall Textures: 1.5 hours
  - Main Menu Wall Materials: 1.5 hours
Total Hours: 17 hours

  - The wall looks really good for our current purposes, giving proper visual cues to the player and better environmental cohesiveness.

  - Taking 7 classes alongside part-time work is very stressful, leaving me very little time during the week to sit down for more than 2 hours and work on assignments straight through.
  -Preparation and work for the Intel Presentation took up a lot of my time this week.