David Cleverdon of 360 Immersive Discusses Virtual Reality and 360-degree Video Production
For those of you who prefer reading an article to watching a YouTube video, we have included a full transcript of the discussion below.
David Cleverdon 1: The reason that you are all here is, in some respect, virtual reality, augmented reality, or some immersive technology. That’s why we put Benjamin Franklin’s quote on this first slide: “Tell me and I forget, teach me and I remember, but involve me and I learn.” It’s the core of what we’re all about, and what we’re all about is immersive technology and, in our particular focus, training and education. So we’re going to talk a lot about that. Marcus is developing a platform, and Josh, I don’t even know, Josh is in the beer world. But our focus, at a top level, is generally training and education and how VR, and shooting 360 video specifically, affects that. But maybe you’re into gaming, or you’re into education, or, what do you do if you’re a newspaper guy?
David Cleverdon 1: Welcome. And he’s from Ohio. Okay. And I forgot your name already, but welcome. So I’m going to tell you briefly, because a lot of you know us, about 360 Immersive. We started in 2001. We’re basically a production and marketing company: we run around with cameras, and we do app development and web development. In early 2015, late 2014 in fact, when the Gear VR first came out, we looked at VR and said we have some clients this might apply to, and we did our first project, which was a recruiting marketing project for Boise State football. So the coaches are running around with a set of goggles. How do you connect kids in Texas and California and Kansas and make them want to come to Boise, when all they’ve heard about is maybe potatoes and blue turf? Virtual reality was the thing back then, and it still is today. So it’s all about creativity, innovation, and technology. We love our toys, as you can see; we brought some of them with us. We’ve focused on and developed a passion for education, for training, for learning, and how this technology can affect you in those regards.
David Cleverdon 1: We do work for law enforcement, fire, EMS, workplace safety, OSHA-related topics, and it’s a very satisfying field, because we want to drive fatalities and injuries to zero, so we can walk away at night hopefully having helped people accomplish something good. So what does VR do? It trains in a more meaningful way. You can develop a more confident and knowledgeable workforce. I’m going to run through these fairly fast so we can get to the meat of the conversation. It helps create a culture of safety and accountability, because we’re increasing retention and increasing engagement.
David Cleverdon 1: Bottom line: trainers are always looking for a better bottom line. And like I mentioned, we want to drive fatalities and injuries to zero. You all know this, but in case you don’t: there are two general types of immersive technology. There’s computer-based and there’s video-based, and sometimes there’s mixed reality, which is kind of a combination of both. Each has specific advantages. With CGI, if you can imagine it, you can create it, within reason. Video, not so much; you have some limitations. CGI gives you a high level of interactivity, and if you choose, you can code it so that you have six degrees of freedom; you can move around in that virtual space wherever you want to go.
David Cleverdon 1: All you guys back there, I mean, stand up, coders, I commend you. You didn’t stand up, but there he is. In our case, dangerous scenarios can be simulated. We can put somebody in a trench that’s improperly shielded, and guess what, that trench collapses and bad things happen. We can flip over a tanker truck. You wouldn’t do that in real life, but we can do it in a simulated environment, and we can teach and we can train and we can educate. Now, gamification may not be for everybody; a lot of organizations want real video and real people, and we’re going to get to that. Another thing with CGI is that it takes a high degree of knowledge and education. Whether you’re in Unreal or Unity, or even Marcus’s platform, it takes some degree of knowledge and education, whereas you can go out and buy a $120 360 camera, pair it with your phone, and start shooting 360 video.
David Cleverdon 1: We’ll get into that. Now, in the case of Marcus’s platform, his platform is actually designed to take a lot of the pain and anguish out of learning Unity or learning Unreal. We’ll talk more about Unity because we’re on that side of the fence. Distribution can also be a challenge. If you went out today, popped your phone up with a little 360 camera ball on top, and streamed it live to Facebook, you’d be producing and running content right now, and we could teach you to do that, if you don’t know how, in 10 or 15 minutes. CGI-based takes a little bit longer than that.
David Cleverdon 1: So what’s the great thing about real video? Guess what: it’s real people. It’s you, and it’s you. You’re kind of bored today, but what did you do, your hand like that? Okay, don’t do it again. You have a high level of detail in the scenes. You’ve got real people; Marcus has got a stain on his shirt, and we’ll see that. You have processes and procedures and tactics that you can see in specific detail, because it’s as real as it gets, because it is real. The only difference from traditional video is that you’re actually in the middle of it; you feel immersed in the scene versus passively watching it. What are some of the challenges? Well, frankly, it’s on rails. If I move the camera over here, even though it’s a 360 environment, the viewer goes with me. In a computer-based scenario, the viewer can walk around wherever he chooses, if it’s coded that way. Is it interactively challenged? Yes. If we pull video into a Unity environment, we can add hotspots and secondary content; we can do a lot, but it takes a high degree of technical ability to do that, beyond just going out, shooting, and producing 360 video.
David Cleverdon 2: Okay.
David Cleverdon 1: A real plus, as I mentioned: $120 will put you in the game. Instead of buying that nice dinner and that glass of wine, you can buy a 360 camera and start shooting today. Easy distribution: you can push it out to Facebook or YouTube; there are a number of different platforms. If you want to show your friends on your laptop, you can load up the free GoPro player, or VLC now supports 360 video; drop your video in and you’re off to the races. So each type of technology has pluses and minuses, depending on your needs, ambition, and dedication to the technology.
David Cleverdon 1: So we started, like I mentioned, almost four years ago, and a lot of these cameras, well, they felt like they were duct-taped together. Then GoPro came out with, actually it was Google, with the Odyssey: 16 GoPros strapped together. We bought an off brand of one of these, but we decided, by the time we got to the ten-unit rig here, which was our first real production camera, that stitching together ten 4K video streams was hard enough, and stitching together 16 video streams was something we didn’t want to get into. In the earliest days we had six cameras, we had four cameras, we had two cameras, but they were all GoPro-based, and a lot of people started that way.
David Cleverdon 1: We bought Nikons; somebody mentioned Nikon. Yeah, the dynamic range on those little Nikons was horrible. I mean, they were compact and all of that, and better than what we had before, but it was bad. So, in conjunction with different cameras: we can hang them off drones, we can hang them off a rig that we built for a sport motorcycle, and of course you have live streaming. There are ten times more cameras on the market today than are represented here, in every variation. We’ll touch a little on the decisions to make when choosing a camera, but I just wanted to show you what we started with, which was essentially held together with rubber bands. When you’re looking at a camera purchase decision: are you a consumer, you just want to play around, and I mentioned the $120 camera; or a prosumer, you want to kind of get there and maybe build toward a business; or are you wanting to build a business, and you need to step into the professional level?
David Cleverdon 1: Those are probably the first questions you have to ask yourself. Budget and application: are you looking at doing training? Shooting music videos? Shooting family weddings? There are so many applications, and each one can have a camera more or less tailored to it, at a different price point. So, action cams: that little tiny camera shoots 5.7K. Great little camera; we’ll get to it a little later. Something more professional grade, like that one, is not as easy to haul around, but it shoots a much better-looking picture. It comes back to resolution. A lot of little cameras are sub-4K. 4K sounds like a lot, and 4K is a lot when you’re looking at a flat image like this, but 4K is not so much when you wrap it all the way around. The better the experience your viewers have, the more impact and meaning your content has. 5K, 5.7K: I mentioned that these little cameras shoot 5.7K in pixels. Today, unless you’re just looking for something to play with, I wouldn’t buy a 4K camera; I’d look for something a little higher, and these run about $600 to $700. Sounds pretty inexpensive.
David Cleverdon 1: 8K and above: that camera shoots 8K, and the biggest problem with processing 8K video at a 250-megabit data rate is that it will bring just about any computer out there to its knees. So usually you want to shoot at the highest resolution and then down-res to be able to work with it. Shooting resolution versus distribution resolution: I mentioned shooting at 8K; absolutely store it at 8K. If you’re going to do something for the Vive or for the Rift, you want to run it at 4K. If you’re going out to mobile, and mobile has to deal with, say, an iPhone 6 as the least common denominator, you’re going to run it at 2K or maybe even a little less.
David Cleverdon 3: Are you going to talk more about viewing devices and their current state? I’m just saying, if you’re going out to iOS or Android and you don’t know what device you’re going to be playing on, we encode for the device it will run on. We don’t have an HTC Vive; I’ve never tried it. It comes down to the resolution of the human eye: your eye only sees about 1K at a time anyway, so anything more you’re sending is spread out, and you’re seeing just a fraction of it. So please, at any point in this discussion, jump in; I’m just running through my crib notes here to kind of get the conversation going, and I know Marcus has a lot to say. I even have a question on that point. Can you somehow describe what 4K looks like? I have a hard time finding good words. For me, 4K in a Vive looks blurry. There’s a significant level of blurriness: it looks awesome, but the moment you focus on anything, it looks a little bit blurry. Have you compared 2K versus 4K versus 8K, all around? How would you describe it, typically?
David Cleverdon 1: Well, there’s more detail. Any time you go through compression, or go to a given resolution, you’re making compromises and sacrifices. That can be color, and there’s a whole color science around encoding, H.264 and H.265. In fact, I was just looking at a test we did for a gentleman doing some forensics kind of work, and he was more interested in resolution than anything else. 8K looks better than 4K. If you’re just looking around and experiencing it, you probably won’t notice; if you look at a still, you’re going to see a difference. Probably what makes the biggest difference is the device you’re looking at it on: what’s the resolution on an iPhone 5 or 6 versus on an Oculus Go? I think the Oculus Go is an awesome piece of hardware. It’s all going to get better; what we’re talking about today is going to be old news a year from now, and prices come down. Before the Go, we used to recommend a Samsung Galaxy S6 or an iPhone 6, bought refurbished if you’re buying a lot of them, like 30 or 40 for a classroom, plus these little BoboVR headsets, which are about $25. But at that point you’re paying $225 to $230, and you can buy a Go for $199; it’s designed for VR, and you don’t have to keep the phone and the headset together.
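To put rough numbers on why 4K can feel blurry in a headset: an equirectangular 360 frame spreads its horizontal pixels across the full 360 degrees, while the headset only shows you a slice of that at once. A small back-of-the-envelope sketch; the ~100° field of view and the ~60 pixels-per-degree figure often quoted for human acuity are common approximations, not figures from the talk:

```python
# Rough pixels-per-degree math for equirectangular 360 video.
# Assumptions (not from the talk): ~100 degree headset FOV,
# ~60 px/deg as a commonly quoted "retina" acuity target.
HEADSET_FOV_DEG = 100

def pixels_per_degree(horizontal_px: int) -> float:
    """An equirectangular frame spreads its width over all 360 degrees."""
    return horizontal_px / 360.0

for label, width in [("2K", 2048), ("4K", 3840), ("5.7K", 5760), ("8K", 7680)]:
    ppd = pixels_per_degree(width)
    visible = ppd * HEADSET_FOV_DEG  # pixels actually inside the view
    print(f"{label}: {ppd:.1f} px/deg, ~{visible:.0f} px across the view")
```

Under these assumptions a 4K equirectangular video delivers only about 10–11 pixels per degree, roughly 1,000 pixels across the visible view, which lines up with the remark above that your eye effectively "only sees 1K" of the frame at any moment.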
David Cleverdon 3: I just want to add, I could imagine having, like, a magnifying glass in a virtual world. I’m looking over here at this drone right here, and you can see all the detail on it and stuff like that. If I had a little magnifying glass with 8K footage, you’d see all the little letters, and you could read them. Yeah, if you guys ever want to play with it, we can send you some 8K video; just send me an email. It’s pretty big. Yeah, it’s there; I can just distinguish all the pixels, all 8,000 of them across.
David Cleverdon 3: Yes, exactly. That’s a really, really interesting challenge: get one of those top-secret 8K devices, run exactly the same video on it versus a classic HTC Vive, and let us look through both. I would love to see an 8K headset. I’ve only heard from people who saw one, and they say it’s such a difference from 4K, and I’m like, give me words to describe this. But I would love to see it. Have you seen it? No, I’m just giving you their words.
David Cleverdon 1: So, less screen-door effect is what I’ve heard. You don’t see the little pixels, but you know that effect. Yeah, I’d love to see it. If you get one, let me know.
David Cleverdon 3: Yeah, for sure, we’ll send it. Oh, Street View? Yeah, I don’t know, but this camera will shoot true 8K. As far as the Google cars, you know, when you see them running around with the thing on top, I don’t know. This one’s Street View certified, so it at least qualifies. All right, any other thoughts? On an unrelated note, I’ve got one: how about stitching together? What kind of software do you use to stitch?
David Cleverdon 1: Are you always the guy that jumps ahead in the powerpoint? Archie. Okay, we’ll get there. Yeah. Yeah. So that’s another thing when it comes to resolution. I mentioned that this camera shoot eight k at 250 meg. Um, when we go out for mobile where it was at a two K at 20, are we what? Yeah, because if it works well and it seems to be a good compromise and we also have to look at if people are downloading the image, you know, video, you have to take that into consideration. So there’s always compromises in this business is if you could run it, have something to play off of that camera, the image would always look good, but you’re probably not going to play it on that phone right there.
David Cleverdon 1: Okay. So in the old days, I mentioned, we used to take ten GoPro cameras in a rig, shoot them, try to get them all lined up, and stitch them together, and that was your image. In fact, a lot of the imagery in our demo app is that kind of footage. It’s a labor-intensive task. You don’t have to do that anymore; a lot of cameras will self-stitch now. You’re making a compromise in resolution, usually at about 4K or even a little under 4K, if you’re going to self-stitch. So if that resolution and data rate work for you, then you won’t be stitching. But let’s say you want to shoot at high res and you’re willing to post-process, which doesn’t necessarily mean you have to stitch it by hand; you’re just running it through the software on your system, which, based on the camera’s internal template,
David Cleverdon 1: We’ll stitch it for you and it may not. If you go out and shoot a a good chunk of video, it may take overnight to do, depending on how much horsepower you have having the computer, but at the end of the day when you come in in the morning, you have some pretty nice video at eight k or you can download the five k, which is what we do are four k depending on what your system will handle and then edit according. The Nice thing is you archive the video at eight k and in in future needs. You have the footage available at a higher resolution. When phones can easily play four or five, six, seven, eight k video, what’s the day will come. I can’t tell you when, so that stitch or not to stitch a zoom, they have an Omni Gopro. Romney people went out and bought a go pro on these. It was kind of that transition thing. You’ve got to stitch that a national inner agency fire. Those folks up there bought an Omni and they go, man, you got to stitch this stuff. That’s just an example of a camera that, you know, it kind of was on that cusp between real time stitching and you’ve got to go ahead and put it together. So I wouldn’t buy a camera that didn’t stitch, but what we do is that we generally will capture it. The highest resolution will post stitch it and then we’ll edit it accordingly.
David Cleverdon 1: Battery life. Spatial audio: you said you love audio, right? What if your camera doesn’t have spatial audio? I have a slide for that; there are other ways to deal with it. This little camera shoots spatial audio, and this guy over here does too. How many people use it? But it’s part of the VR experience. Monitoring your camera:
David Cleverdon 1: you know, you’re in the frame. If you’re standing next to you can. It’s like Zach sitting back behind that camera right now. You can’t do that. Sorry. You can do that, but you’re going to be in the shot. So you got to go hide. So how do you do it? Well, most cameras nowadays allow an APP that you can at least monitor a point of view and some of them you can monitor your audio and I’m not. I’m taking too much time with this summer jumped through dynamic range. I mentioned the Little Nikon, which we had a bunch of those before. Those cameras for action cameras. They were good cameras, but the dynamic range was horrible and if you don’t want to, I don’t know what dynamic ranges, its ability to capture highlights and darks and to be able to to image between the two and the higher the dynamic range, the better it gets closer to what we see as humans. The Nikon, if we shot in here, this would be totally blown out. You wouldn’t see any detail at all because it had such low dynamic range. These little cameras actually, they’ll do a pretty good job.
David Cleverdon 1: Okay, I’m going to jump through here. We all know what a tripod is, right? A typical tripod has big legs; we like these little things with little legs. When you’re shooting with a 360 camera, if you take a big tripod like that and put a VR camera on it, you see this big chunk underneath it, whereas with these, it’s pretty easy to paint out. When VR first came out, there weren’t these massive industrial giants creating VR mounts and accessories, so you make your own. This is a painter’s pole on a standard camera tripod, and we purchase different painter’s poles depending on what we need for the legs. It’s a small footprint, and you can pick what you choose. Same with this; this is literally something you can buy at Lowe’s for $10.
David Cleverdon 2: painful gimbals
David Cleverdon 1: they have, especially for the Action Cam, uh, these small cameras, uh, they’ve come out with a number of handheld gimbals. You’re still in the shot somehow, but at least it will stabilize the footage. We’ll get into cameras stabilization because we don’t use a gamble because these cameras self stabilize. If you post process it. And I’ll touch on that selfie sticks. Who hasn’t used a selfie stick? Who hasn’t seen three 60 video like this? Skiing. Snowboarding. That’s great. Except you’re always looking at. If you turn around, you’re always looking at the guy. It’s just part of it, but it’s a different style of mouth. Tell them my mouth.
David Cleverdon 1: We would absolutely for the first two and a half years, never put a vr camera on a helmet because you make people sick. It’s best practices. This little Garmin self stabilizes so that you can actually put a amount. You can put something like this if you want to set it off a little bit or you can put it, make it a little shorter like this and it works really well. Uh, well you can run around the block with it and it will stabilize to the point that it looks like it’s on a rail. And that’s the garment because it has geometrics data that it understands where it is in three d space and it will stabilize it back. It’ll try to stabilize this. We’ve tried that and it does a pretty good job of it.
David Cleverdon 1: We did a bunch of work for Blm with the a smoke jumper and they really wanted us to put a camera on their helmet and we said, you can’t do it. You’ll make people sick. You just, you know, it’ll be a poor experience today. If we were gonna shoot, that same thing we would could, you can stabilize it. Drones. We’ve got lots of drones here. This guy we keep around because it can pick up a bigger camera. We’d do this little guy you can pull out of your backpack and you got a flying vr camera that if you’re a, if you’re familiar with the maverick pro, the maverick pro is a pretty revolutionary by Dj I amongst other drones by the same company. But uh, we, uh, came up with a way to hang the thing. It’s kind of looks diy and diy, but there you go.
David Cleverdon 1: So how do you take a guy out of the shot? Let’s say you want to do a tour or let’s say you want to, you want to get involved with some team action. And that’s where we built this thing is because we, something that would, you know, when I walk, I naturally go up and down, now I can stabilize this way, but I still have that, that feeling that if you’re running a rover, unless you’re running it through the Sagebrush, that you’re a lot smoother. And so it’s a lot easier to do a deal with that. So if you’re doing facility tours, if you’re training, if you’re, if you’re doing anything that you want to dynamically move that camera with the team or with the process or the tactic that’s being portrayed, that’s what the Robertson. And this is just the fifth scale Dune buggy that we stripped down and built back up. So again, for us, this whole world is a little bit of diy, but there’s nothing wrong with that because you see a need and you build it,
David Cleverdon 1: you know, uh, I mentioned motorcycle, we do a lot of motorcycle training and so we built an amount that actually hangs the ball out here because you can turn around and see the, the, uh, writer. But you, more importantly, you can look down and you could see his hands, you can see the gauges, you can see the tilt, you can see what he’s doing. We tried cameras behind him. It didn’t work out. All you did is see him in the back of his helmet, on the handlebars. That didn’t give the feeling of writing. But if you put that camera right here, you actually feel like you’re riding on the bike. We’ve never done a cable cam. I know people have and I’m sure that you will see br cable cam soon to on ESPN. But uh, it’s just the same way they run a cable cam. Now you’re going to see it in Vr and who doesn’t like to go diving with you? You mentioned. Can you take them underwater? This is good to like 160 feet. Any other questions on mounts by the way? Uh, you end up collecting like just boxes and just stuff. If you’ll look over and there’s a whole box of this kind of stuff because you’re attaching cameras to cars and to just, you just end up needing it.
David Cleverdon 3: Yeah. One of the other speaking or cable cable can now, do you want to launch it on facebook? You know, I’m standing rug under the condition said the x games money, cable cancer, three 60 video. So if you want to watch it on facebook and I don’t know what to is what he
David Cleverdon 1: said, but I think it’s cool. I mean I am, I’m right under here and I can’t hear very well. Anyhow, so spiritual audio. I’m gonna keep running through this because uh, so, uh, you can decide whether you want to shoot spacial or stereo a zoom. H Two n is a, an older to channel a recorder like there. That’s a six up there on that camera, but it’s got some firmware. If you want to shoot spatial you can, you can take a camera that doesn’t necessarily shoot spacial audio and record a wild and you can, you can made it with it. Um, yeah. How do you handle your talent? How do you make your talent? Well, you can’t in the traditional way. You can’t take a boom mic and an audio guy and he, because he’s going to be in the shot, so you know, that’s something to consider when you’re setting up your scene. How do you monitor audio? Some of the apps will allow you to monitor audio or you can actually take a feed off one of the, uh, audio interfaces and in wireless we haven’t set that up yet, but it can be done.
David Cleverdon 1: Everybody know what recording wild means. You have a separate recorder versus the camera. You cooperate, you sync them up and post
David Cleverdon 2: event event is tough
David Cleverdon 1: if you’re trying to run off of Mike and you got an air conditioner and you got a roomy noise and you get all this stuff going on. Uh, if you can take a feed like I’m running a wireless lab, but if you can come off from a board, um, you, you’re going to get a lot cleaner. I think I’d almost rather be cool than not here. Live streaming. Live stream. Anybody live stream? Yeah, there you go. How about three 60 live streaming.
David Cleverdon 2: Okay.
David Cleverdon 1: That camera doesn’t really. Well, uh, if you get again, run through an audio interface and, and manage the audio, that’s one of the biggest challenges. Yeah. In this camera we can, we can only go out to one at a time. So usually it’s facebook.
David Cleverdon 2: Yeah, Youtube lite. Yeah.
David Cleverdon 1: Okay. We mentioned you’ve got to hide. We’ve got all kinds of options: where do you take your camera, where do you put your camera, are you blocking out your talent? People think about 360 video and assume viewers are looking around all the time. That’s really not the case, because most of the time I’m looking at what’s in front of me; I may look back and forth. So think about your point of view: like the newspaper guys who have imported themselves from Ohio to Boise, Idaho, I’m going to be right here, not over here making you turn to look. Likewise, if you have a different point of view on different shots, you want to shift the video in the editing process, because you can shift the point of view, so that I’m not looking here, then the scene cuts, and then I’ve got to look over there.
David Cleverdon 1: You want to kind of manage where the viewers looking, but remember for the most part, even in their inner three environment, they’re going to look forward most of the time. So what does that do for you to allow you to manage your shot’s better? You don’t necessarily have to worry as much about what’s going on all the time behind unless it’s importance and if it’s important, then use something to draw the viewer over there. And audio cues are generally a good way to do it. Something Goes Bang or pop or somebody yells that you know, get their attention over there or shift the scene. So it’s over here, they’re looking here and then the next cut you want to shift it into overseen over so they don’t have to do the ping pong, a blocking your town lighting the scene. We’re shooting vr route that right now and I wanted video lights in here. What do you think? I’m going to see video, right? So you can hide them and try to hide them, but it’s challenging. Sometimes you just gotta go with what you got or get really creative.
David Cleverdon 1: Moving the camera: like I mentioned, there are all kinds of ways of moving the camera, including putting it on somebody’s head. I can take this little selfie stick right here and walk around like that, which we did a lot in the early days. You see the top of my head if you look down, but you can put a cap on and do a little branding to cover up my hair. Shooting in a vehicle: we made a lot of mistakes shooting in vehicles. You have bright sunshine outside and a relatively dark interior, like in a patrol car. So we’ll usually take two little LED panels and put them underneath the camera, which kicks light up and brightens it, and then we drop the camera two stops. That way, when it’s trying to decide whether to expose for inside or outside, we’ll see detail in the windows, but we’ll still have enough detail that we can bring up the interior of the car in post.
David Cleverdon 1: So shooting vehicles is fine, especially if the guys like to do 100 miles an hour. We shot race cars at Firebird at 240, and VR is great for that, because nobody ever gets to do 240 miles an hour for six seconds except the driver; now you can experience it. The first time we tried this, we crashed it, because I mounted this rigidly; I bolted that to that. This thing swings a little, and it starts to get worse, and if you don’t have the flexibility of letting them balance independently, it gets worse and worse, and suddenly it ends up on the ground. But once you put a joint in there, so the drone can self-stabilize and the camera can swing freely, you can stabilize the footage and it looks pretty good. You can take a little drone in the backpack and a little camera in the backpack and go shoot all the 360 footage you want.
David Cleverdon 1: I have great footage of that. By the way it goes as it crashed. Uh, if you’re shooting, it’s sometimes good to get, especially if you’re shooting from a tripod, take your phone and shoot some textures of the flooring because if you want to paint it out later on that you might use that, you know, Mocha. If you, uh, if you track your footings, let’s say I have the rover driving like this, I can grab footage in there and Matt and Amanda it in here, in that rover just goes away. But sometimes you know, it takes you two seconds to grab a piece of texture and it, it might be beneficial camera audio. Just do the pay attention to audio because if you have great video and really bad audio, it’s a bad piece. If you have great audio and kind of marginal video, you know, most people get along with that. Audio will always create a worst presentation than bad duty shop management. Uh, take notes, clapper your audio if you’re going to shoot at wild, um, file management more than anything else. How many times do you have all these different shots and all these different files floating around and if you don’t manage it correctly, you just created a nightmare for yourself, especially if it’s a big shoot.
David Cleverdon 1: Alright. Stitching: in the old days, like I mentioned, we had many, many cameras. You had to sync them all up and stitch them all together, and it was a labor-intensive process — it took hours and hours to make one scene. Today you don't have to do that. It's RTS — real-time stitching — or post-processing, or both. Some cameras, if you tell them to real-time stitch, will give you that real-time stitch and also record the high-res individual files so you can stitch or process them later, which is the best of both worlds. Camera utilities: most cameras now come with a whole suite of utilities that make your job easier. They do the stitching for you, do the processing, and some do light editing. We still use Adobe for our editing, but with some of these utilities you can get along without it. Stabilization and horizon correction: this is an amazing little camera, because it captures data from both accelerometers and GPS and uses that to stitch and correct. In fact it can do overlays, so I can get acceleration, I can get speed, I can get distance, I can get a map overlay — overlay it on Google Earth or Google Maps if you want — all from this little camera.
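The speed and distance overlays he mentions come from the camera's GPS log. As a rough sketch of what that computation looks like — assuming a simple list of timestamped latitude/longitude fixes, which is an illustrative data shape rather than any camera's actual log format — per-segment speed is just great-circle distance over elapsed time:

```python
import math

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in meters between two GPS fixes."""
    r = 6371000.0  # mean Earth radius, meters
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def speed_track(fixes):
    """fixes: list of (t_seconds, lat, lon) tuples.

    Returns speed in m/s for each consecutive pair of fixes --
    the kind of per-sample value a telemetry overlay would draw.
    """
    speeds = []
    for (t0, la0, lo0), (t1, la1, lo1) in zip(fixes, fixes[1:]):
        speeds.append(haversine_m(la0, lo0, la1, lo1) / (t1 - t0))
    return speeds
```

Real telemetry overlays also smooth the GPS track and fuse in the accelerometer data; this only shows the core distance/speed arithmetic.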
David Cleverdon 1: So, back in the old days — and we have used them in the last year — we had seats of both VideoStitch and Kolor Autopano Video. Like I say, it's labor-intensive, but if you're looking for fine tuning that maybe your camera's own software can't do, you may want to jump back into those tools. We had one situation where we had a stitch line that we just hated, so we went back and fixed it. We use Adobe Premiere, and After Effects has a whole suite of VR tools now. I'm sure Apple has come up to speed too, and I don't know what you all use or want to use, but we use Adobe. Managing the point of view — I brought that up before — you can actually shift the viewer's point of view so they don't constantly have to be bouncing back and forth.
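The point-of-view shift he describes has a simple core in the yaw axis: in an equirectangular frame, longitude maps linearly to the x axis, so rotating the viewer's starting direction left or right is just a horizontal pixel shift with wrap-around. A minimal sketch, treating a frame as a list of pixel rows:

```python
def shift_yaw(frame, yaw_degrees):
    """Rotate an equirectangular frame about the vertical (yaw) axis.

    In an equirectangular projection, longitude maps linearly to x,
    so a yaw rotation is a horizontal shift with wrap-around.
    `frame` is a list of rows, each a list of pixel values.
    """
    width = len(frame[0])
    shift = int(round(yaw_degrees / 360.0 * width)) % width
    return [row[shift:] + row[:shift] for row in frame]
```

Pitch and roll are not a simple shift — they require a full spherical resampling of the image — which is why for anything beyond recentering the yaw you would reach for the VR tools in After Effects (or a similar reorientation filter) rather than rolling your own.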
David Cleverdon 1: You can manage your shots so viewers have a smooth experience in where they're looking. Detail video in a 360 world: wouldn't it be great if you could take the detail of how to get the kink out of this hat right here — the one he sat on accidentally — and put it in a 360 environment? You can, in the post process. You can take detail video — phone video, or video off a regular camera — of a process or procedure and lay it into the 360 environment. Or instruction: a shot of me up here talking for way too long, laid into that 360 world, so I get that instruction from an individual and can put it into 360 content. You marry the two technologies. Lower thirds, graphics, and animation all allow a richer environment.
David Cleverdon 1: All of those go into 360 video just as they go into traditional video. Same with editing and transitions: there are VR transitions that folks have come up with. We don't use them much, but they're out there — they're in Premiere, and I'm sure you can buy packages. Same with effects and color correction. We already talked about output resolution and data rates, so I'm not going to cover that again. Distribution — how do you distribute this? You guys pretty much know this stuff: it's either mobile or WebGL. WebGL is still a technology that needs to be polished a little, but these are just different pipelines. Once you create your content, push it out in different ways: put it out on Facebook, put it out on YouTube, build a mobile app and put it out, show your friends. Anyhow, I hope I haven't droned on too long, but I didn't see too many people leave. Okay. Any questions? Any thoughts? Was it beneficial? Do you guys need more detail on things like that? Just look for — what's that thing called? We have about 10 or 12 of them out there, but the portfolio app is...
David Cleverdon 2: Yeah. Yeah.
David Cleverdon 1: So who all was thinking about shooting video versus maybe coding it? Who wants to shoot it? Or did you guys just come to hear about this?
David Cleverdon 3: I have a question. I really want to use 360 video, but I'll give you an example of where I'm going: people say, "Well, why not just shoot regular video?" So how do you convince somebody who doesn't see the value, or how do you convince them otherwise?
David Cleverdon 1: So here’s, here’s the different, here’s how we look at it. There are great applications for three 60 video and especially in, in circumstances that are situationally aware. I’m looking around this room and I can see each one of you. I can look behind me and look at this panel. There are applications that you’d go, boy, it doesn’t really feel. It feels like I should just take a video camera, take my phone and shoot this, but the one difference that appeals to a lot of businesses and more and more is that it’s a connection and it’s different. If their people are continually used to looking at video and they’re. They’re did, they’re saturated with it. It doesn’t seem to be having any effect anymore, but it’s a topic like we had one group that at Colbis in call center folks. Okay, so call center. I mean it’s like a bunch of people talking on the phone.
David Cleverdon 1: How could that be an application for 360 video? But their interest was that you could build empathy — not only for the person dealing with the guy who's mad on the phone, but also by putting them in that caller's shoes and understanding why he's upset, why he's mad. So it's not necessarily the visual aspect; it's a connection that's different from traditional video or PowerPoint slides or pictures or anything else. Another one: the Army is looking at an application for dentistry. Dentistry — okay, yeah, they're doing that. Their issue is contamination and hygiene and all the things we don't even think about at the dentist, but in this case it is that connection. We have an app that teaches folks about understanding schizophrenia. In this case there's a real application, because we can actually put somebody in the place of experiencing what people suffering from schizophrenia experience — the color shifting, the blurred vision, the voices, everything all around them, irritating and nagging. And then we shift back to a law enforcement point of view, to help them understand that, even though you don't understand it, that is their reality.
David Cleverdon 1: It's understanding that their reality is going to be different from the law enforcement officer's, so the way to respond when they get aggressive might be a little different from what the officers are trained to do. VR can give you a different reality. I mean, we think VR is kind of cool, right? We're all here. But you've experienced alternative realities since you grew up. Who's been lost in a book? Josh? Okay. Who's seen a great movie you were so involved with that you couldn't believe it was over, because you were so tied into the plot? Television. All of these things are alternative realities, and we think they're really commonplace. Technology today gives us the ability to put ourselves in a position to understand circumstances — in training, entertainment, or gaming — but also to build empathy, connection, and understanding about somebody else's world.