James Murray from the Impractical Jokers talks about the future of VR

James Murray is a funny man. A producer, actor, and writer, Murray is best known as Murr on the show Impractical Jokers. I spoke to him for a Technotopia interview about the future of TV, VR, and media, and he had a lot to say.

His dream? To offer immersive experiences to his audiences using VR, a dream that he thinks is still far off. Until the VR experience is out-of-the-box easy, he said, there isn’t much hope for the medium. He’s a funny guy and this is one of my favorite interviews.

Technotopia is a podcast by John Biggs about a better future. You can subscribe on Stitcher, via RSS, or on iTunes, and listen to the MP3 here.


Robo Wunderkind wants to build the Lego Mindstorms for everyone

Lego Mindstorms paved the way for many programmable toys, and Austrian startup Robo Wunderkind is building a new kind of Lego-like programmable kit. The startup first launched on the TechCrunch Disrupt stage and just raised $1.2 million (€1 million) from SOSV, the Austrian Federal Promotional Bank and multiple business angels.

Unlike many programmable toys out there, Robo Wunderkind is still a Lego-like building kit. This is key, as too many toys forget that it’s fun to build something with a few bricks.

Robo Wunderkind also has special blocks to turn your dumb robot into a connected one. In addition to the usual sensors, such as proximity sensors, motion detectors and light sensors, the company also has some more sophisticated ones. You can put a tiny camera in your construction, use an IR blaster and receiver and program a tiny LED screen.

But the best part is that Robo Wunderkind also sells Lego adapters so that you can put together a sophisticated robot that uses both Lego bricks and Robo Wunderkind modules.

The company has two different apps in the store. The first one, Robo Live, lets you control your robot in real time. The other, Robo Code, has a brand new user interface and now detects the blocks you’re currently using.

Robo Code is where Robo Wunderkind shines because you can put together simple algorithms by arranging virtual blocks in the iPad app. It’s a good way to introduce a kid to conditional statements and loops.
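
Arranging those virtual blocks amounts to building ordinary control flow out of small, composable pieces. Here’s a rough sketch of the idea in Python; the block names, the `Robot` class and its sensor are invented for illustration and are not Robo Wunderkind’s actual SDK:

```python
# Toy model of a block-based program: each "block" is a small function,
# and bigger blocks are built by wrapping smaller ones.

class Robot:
    """Hypothetical robot state; a real kit exposes sensors and motors."""
    def __init__(self, distance_to_wall):
        self.distance_to_wall = distance_to_wall
        self.log = []

    def drive_forward(self):
        self.distance_to_wall -= 1
        self.log.append("forward")

    def turn_left(self):
        self.log.append("turn")

def avoid_wall(robot):
    """Conditional block: IF the proximity sensor sees a wall, turn."""
    if robot.distance_to_wall <= 1:
        robot.turn_left()
    else:
        robot.drive_forward()

def repeat(times, block):
    """Loop block: snap another block inside and run it N times."""
    def loop(robot):
        for _ in range(times):
            block(robot)
    return loop

program = repeat(5, avoid_wall)   # a loop block wrapped around an if/else block
robot = Robot(distance_to_wall=3)
program(robot)
```

Snapping a loop block around a conditional block, as here, is exactly the kind of program a kid would assemble on the iPad.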

You won’t build a robot as sophisticated as one made with Lego Mindstorms, but Robo Wunderkind seems more accessible and is a good way to try robotics before switching to an Arduino or Raspberry Pi when your kid grows up.

The company successfully raised a little less than $250,000 on Kickstarter back in 2015. You can now buy a starter kit for $250. Advanced and professional kits will also be available soon.


Massterly aims to be the first full-service autonomous marine shipping company

Logistics may not be the most exciting application of autonomous vehicles, but it’s definitely one of the most important. And the marine shipping industry — as you can imagine, one of the oldest industries in the world — is ready for it. Or at least two major Norwegian shipping companies are: they’re building an autonomous shipping venture called Massterly from the ground up.

“Massterly” isn’t just a pun on “mass”; “Maritime Autonomous Surface Ship” is the term Wilhelmsen and Kongsberg coined to describe the self-captaining boats that will ply the seas of tomorrow.

These companies, with “a combined 360 years of experience” as their video put it, are trying to get the jump on the next phase of shipping, starting with creating the world’s first fully electric and autonomous container ship, the Yara Birkeland. It’s a modest vessel by shipping standards — 250 feet long and capable of carrying 120 containers, according to the concept — but it will be capable of loading, navigating and unloading without a crew.

The Yara Birkeland, as envisioned in concept art.

(One assumes there will be some people on board or nearby to intervene if anything goes wrong, of course. Why else would there be railings up front?)

Each ship carries major radar and lidar units, visible-light and IR cameras, satellite connectivity and so on.

Control centers will be on land, where the ships will be administered much like air traffic, and ships can be taken over for manual intervention if necessary.

At first there will be limited trials, naturally: the Yara Birkeland will stay within 12 nautical miles of the Norwegian coast, shuttling between Larvik, Brevik and Herøya. It’ll only be going 6 knots — so don’t expect it to make any overnight deliveries.

“As a world-leading maritime nation, Norway has taken a position at the forefront in developing autonomous ships,” said Wilhelmsen group CEO Thomas Wilhelmsen in a press release. “We take the next step on this journey by establishing infrastructure and services to design and operate vessels, as well as advanced logistics solutions associated with maritime autonomous operations. Massterly will reduce costs at all levels and be applicable to all companies that have a transport need.”

The Yara Birkeland is expected to be seaworthy by 2020, though Massterly should be operating as a company by the end of the year.


Under a millimeter wide and powered by light, these tiny cameras could hide almost anywhere

As if there weren’t already cameras enough in this world, researchers created a new type that is both microscopic and self-powered, making it possible to embed just about anywhere and have it work perpetually. It’s undoubtedly cool technology, but it’s probably also going to cause a spike in tinfoil sales.

Engineers have previously investigated the possibility of having a camera sensor power itself with the same light that falls on it. After all, it’s basically just two different functions of a photovoltaic cell — one stores the energy that falls on it while the other records how much energy fell on it.

The problem is that if you have a cell doing one thing, it can’t do the other. So if you want to have a sensor of a certain size, you have to dedicate a certain amount of that real estate to collecting power, or else swap the cells rapidly between performing the two tasks.

Euisik Yoon and post-doc Sung-Yun Park at the University of Michigan came up with a solution that avoids both these problems. It turns out that photosensitive diodes aren’t totally opaque — in fact, quite a lot of light passes right through them. So putting the solar cell under the image sensor doesn’t actually deprive it of light.
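
The arithmetic behind that stacked design is easy to sketch. If a fraction of the incident light passes through the photodiode layer to the cell beneath, the imager keeps the whole pixel area and harvesting scales with that transmittance, instead of with surrendered sensor area. A back-of-the-envelope comparison in Python (all numbers are illustrative, not from the paper):

```python
def harvested_power(incident_mw, pv_area_fraction, transmittance, stacked):
    """Compare the two layouts.

    Side-by-side: only the dedicated PV area collects power.
    Stacked: the whole area collects whatever light leaks through the diodes.
    """
    if stacked:
        return incident_mw * transmittance   # full area, attenuated light
    return incident_mw * pv_area_fraction    # partial area, full light

incident = 10.0  # mW falling on the sensor (illustrative)
side_by_side = harvested_power(incident, pv_area_fraction=0.3, transmittance=0.5, stacked=False)
stacked = harvested_power(incident, pv_area_fraction=0.3, transmittance=0.5, stacked=True)
```

With these made-up numbers the stacked layout harvests 5.0 mW against 3.0 mW for the side-by-side one, without giving up any imaging area.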

That breakthrough led to the creation of this “simultaneous imaging and energy harvesting” sensor, which does what it says on the tin.

The prototype sensor they built is less than a square millimeter, and fully self-powered in sunlight. It captured images at up to 15 frames per second of pretty reasonable quality:

The Benjamin on the left is at 7 frames per second, and on the right is 15.

In the paper, the researchers point out that they could easily produce better images with a few tweaks to the sensor, and Park tells IEEE Spectrum that the power consumption of the chip is also not optimized — so it could also operate at higher framerates or lower lighting levels.

Ultimately the sensor could be essentially a nearly invisible camera that operates forever with no need for a battery or even wireless power. Sounds great!

In order for this to be a successful spy camera, of course, it needs more than just an imaging component — a storage and transmission medium are necessary for any camera to be useful. But microscopic versions of those are also in development, so putting them together is just a matter of time and effort.

The team published their work this week in the journal IEEE Electron Device Letters.


Virgin Galactic successfully tested its rocket-powered spacecraft today for the first time since 2014

Virgin Galactic took to the skies today for the first test of its rocket-powered spacecraft in over three years. The carrier aircraft released the VSS Unity at a set altitude, where the spacecraft was to fire its engines for as long as 30 seconds, bringing the craft to one and a half times the speed of sound. This was the first powered test of the Unity since the SpaceShipTwo Enterprise broke up during a test flight in late 2014.

After the accident, Richard Branson’s space program reworked a number of components and has lately ramped up testing, including glide tests of the Unity.


For today’s test, two pilots — Mark “Forger” Stucky and Dave Mackay — were at the controls of the VSS Unity as it dropped from its mothership. Unlike the original SpaceShipTwo vehicle, the Unity was built by The Spaceship Company, a subsidiary of Virgin Group, which is also building two more spaceships for the space company.

Virgin Galactic has yet to announce a target altitude or speed for this test. This was a big test for the company, and it has been relatively quiet in the lead-up — a stark difference from Elon Musk’s SpaceX.

Update: Richard Branson just released a bit of info minutes after the flight.


Virgin Galactic was founded to provide a reusable platform for reaching sub-orbital altitudes of about 68 miles above the Earth. It’s capable of carrying passengers, who are expected to pay around $250,000 for the trip, and today’s flight showed that the company is back on track to be a viable space delivery system. It’s unlikely the company could have survived another fatal disaster.



Smartsheet co-founder’s next project is a robotic rock picker-upper

A co-founder of Smartsheet, the enterprise collaboration startup that just filed for an IPO, is taking a hard right turn into the world of agriculture robotics. Brent Frei tells GeekWire that he has been working on an automated system for clearing rocks from land. It’s a bit unexpected, but far from a bad idea.

While doing a little farming work with his kids last year, including the less than stimulating task of picking up big rocks and throwing them in a tractor-trailer, it occurred to him that this was precisely the kind of thing that an automated platform would be good at.

There are some semi-automated solutions, but nothing simple enough that you could just plop it on a few acres and tell it “go grab all the rocks this big or bigger.”
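
Strip away the robotics and “grab all the rocks this big or bigger” is a size threshold over a list of detections, plus a route to visit them. A minimal sketch in Python (the detection format and the greedy nearest-first ordering are my own illustration, not TerraClear’s system):

```python
import math

# Hypothetical detections from a vision pipeline: (x, y, diameter_cm).
detections = [
    (2.0, 3.5, 4.0),
    (5.1, 1.2, 12.5),
    (7.8, 9.9, 30.0),
    (1.1, 4.4, 8.0),
]

def rocks_to_pick(detections, min_diameter_cm):
    """Keep only rocks at or above the operator-chosen size threshold."""
    return [d for d in detections if d[2] >= min_diameter_cm]

def pick_order(targets, start=(0.0, 0.0)):
    """Greedy nearest-first route over the target rocks."""
    order, pos, remaining = [], start, list(targets)
    while remaining:
        nearest = min(remaining, key=lambda d: math.hypot(d[0] - pos[0], d[1] - pos[1]))
        order.append(nearest)
        remaining.remove(nearest)
        pos = (nearest[0], nearest[1])
    return order

targets = rocks_to_pick(detections, min_diameter_cm=10.0)
route = pick_order(targets)
```

The hard part, of course, is the perception that produces that list, not the list processing itself.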

Why not apply all the tech that’s going into watering, growing and picking to this? It seemed that, at the very least, he might make something he himself could use, so he started TerraClear in October to create a “Roomba for rock picking.”

It’s still a ways off even from the prototype stage, but it’s a great example of how wide open the world is to new applications of computer vision and robotics if you keep your mind open.


Last march of the Penryns: Intel cuts Spectre fixes for some older chips

As part of its ongoing efforts to patch its systems against the Meltdown and Spectre chip flaws, Intel indicated last month that it would be issuing fixes as far back as the Yorkfield processors. But in a new guidance document, the company has announced that many of these older platforms will not receive fixes after all.

Specifically, work has been stopped on Spectre Variant 2 mitigations for the chip generations known as Bloomfield, Clarksfield, Gulftown, Harpertown, Jasper Forest, Penryn, SoFIA 3GR, Wolfdale and Yorkfield. (You can find more specifics at this great list of Intel codenames on Wikipedia.)

Variant 2 is the toughest of the chip flaws to block or work around, so the creation of fixes is nontrivial — Intel isn’t just copying and pasting stuff into a microcode update for each of these.

In the guidance document (PDF), Intel cited several reasons for stopping development on the fixes:

  • Micro-architectural characteristics that preclude a practical implementation of features mitigating Variant 2
  • Limited Commercially Available System Software support
  • Based on Customer inputs, most of these products are implemented as “closed systems” and therefore are expected to have a lower likelihood of exposure to these vulnerabilities.

In other words: it’s super hard, they’re barely supported and few people are using them where the bugs could be exploited.

It’s a reasonable walkback of the scope of Intel’s mitigation efforts, especially when you look at the size of the list of platforms that are having the problems addressed. Still, system administrators may want to cast an eye over their inventory to make sure no chips of these generations get exposed to the untamed wilds of userland.
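
On a Linux fleet, that audit can start with the kernel’s own reporting: modern kernels expose a per-vulnerability status file under sysfs. A small Python sketch (the sysfs path is real on recent kernels; the interpretation logic is mine):

```python
import os

VULN_FILE = "/sys/devices/system/cpu/vulnerabilities/spectre_v2"

def spectre_v2_status(path=VULN_FILE):
    """Return the kernel's Spectre Variant 2 status string, or None
    if the kernel is too old to report one."""
    if not os.path.exists(path):
        return None
    with open(path) as f:
        return f.read().strip()

def looks_unmitigated(status):
    """A status starting with 'Vulnerable' suggests a machine that never
    received a fix, such as the generations Intel has now dropped."""
    return status is not None and status.startswith("Vulnerable")
```

Running `spectre_v2_status()` on an old Penryn-era laptop would be one quick way to see where it stands.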

And for users: the Penryns (Core 2 Duos in particular) were very popular, and I wouldn’t be surprised if a few people were still running an old laptop with one — they were in all kinds of things back in ’08. If you’re one of those sentimental types, like me, who keeps these things around, you should probably avoid doing anything critical on them.

Intel sent along a statement to accompany the guidance, which seems rather redundant with the above, but just in case:

We’ve now completed release of microcode updates for Intel microprocessor products launched in the last 9+ years that required protection against the side-channel vulnerabilities discovered by Google. However, as indicated in our latest microcode revision guidance, we will not be providing updated microcode for a select number of older platforms for several reasons, including limited ecosystem support and customer feedback.


From fungal architecture to shape-shifting robo-swarms, here are NASA’s latest moonshots

The NASA Innovative Advanced Concepts program is perhaps the best place to get federal funding for an idea that sounds crazy — because the program managers think it might be just crazy enough to work.

Researchers making the “Phase I” cut are awarded about $125,000 over 9 months to develop their idea, be it mind-boggling or merely technically difficult. If significant progress is made or the concept is otherwise found to be promising, a second “Phase II” investment of up to $500,000 can be made at NASA’s option.

This year’s competition, according to NIAC program executive Jason Derleth, was “especially fierce, with over 230 proposals and only 25 winners.” A significant number of Phase II awards were also made (you may remember some from last year’s selections).

I’ve collected most of them here with explanations in the plainest language I could summon — click on to see what NASA thinks the future of space exploration might look like.


iOS could detect when you hover your finger over the screen

According to a new report from Bloomberg, Apple could be working on new gestures for its iPhones. In addition to normal touch gestures, iOS could detect when you hover your finger over the screen to trigger some actions.

When Steve Jobs introduced the first iPhone, he spent quite a bit of time demonstrating the multitouch interface. You could touch the screen with your finger without applying any pressure, which was already something new back then. You could also swipe your finger across the screen, or use multiple fingers to pinch to zoom or rotate a photo.

Starting with the iPhone 6S, Apple also introduced another gesture with 3D Touch. By applying some pressure on the screen, you can preview a photo or an email, open a shortcut menu and more. The iPhone detects multiple levels of pressure so that you can first preview and then open a document.
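
Under the hood, a feature like this comes down to mapping a continuous pressure reading onto discrete gesture levels, usually with a little hysteresis so the gesture doesn’t flicker when a finger sits right at a boundary. A sketch of that idea in Python; the thresholds, state names and values are illustrative, not Apple’s implementation:

```python
# Illustrative pressure thresholds (normalized 0..1), not Apple's values.
PEEK_THRESHOLD = 0.4   # light press: show a preview
POP_THRESHOLD = 0.8    # deep press: open the item
HYSTERESIS = 0.05      # slack before dropping back out of a state

def classify_press(pressure, current_state="idle"):
    """Map one pressure sample to a gesture state. Once in a state, the
    reading must fall a bit below the threshold to leave it again."""
    peek_cut = PEEK_THRESHOLD - (HYSTERESIS if current_state in ("peek", "pop") else 0)
    pop_cut = POP_THRESHOLD - (HYSTERESIS if current_state == "pop" else 0)
    if pressure >= pop_cut:
        return "pop"
    if pressure >= peek_cut:
        return "peek"
    return "idle"

# Press in gradually, ease off slightly, then release.
states, state = [], "idle"
for sample in [0.1, 0.45, 0.85, 0.78, 0.3]:
    state = classify_press(sample, state)
    states.append(state)
```

Note how the 0.78 sample stays in “pop” thanks to the hysteresis; a hover gesture would presumably slot in as another level, driven by a proximity signal instead of pressure.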

According to Bloomberg, upcoming iPhones could also detect touchless gestures right above the display. It’s unclear how Apple plans to use those new gestures when it comes to software implementation. This feature won’t be ready for this year’s new iPhones.

Bloomberg also says that Apple has been experimenting with curved iPhones. But they won’t look like the Samsung Galaxy S9, as Apple is thinking about an iPhone that curves from top to bottom, like a banana.

Finally, Bloomberg confirms KGI Securities’ report about this year’s iPhone lineup. Apple is working on three new devices — an updated iPhone X, a new iPhone that looks like an iPhone X but is cheaper thanks to an LCD display, and a larger version of the updated iPhone X.

The larger version could feature a 6.5-inch OLED display. This number seems insane given that the first iPhone only had a 3.5-inch screen. But people spend so much time on their phones that there should be a market for such a huge device.


Skydio R1 review: a mesmerizing, super-expensive self-flying drone

The idea of a robot methodically hunting you down isn’t the most pleasant of concepts. A metal-bodied being zooming after you at up to 25 miles per hour with multiple eyes fixed on your location seems… not in your best interest.

The Skydio R1 drone seems friendly enough, though. I wouldn’t call it loving or cute by any means, but it really just wants to keep up with you and ensure it captures your great life moments with its big blue eye.

What makes the $2,499 Skydio R1 special is that it doesn’t need a pilot — it flies itself. The drone uses 12 of its 13 on-board cameras to rapidly map the environment around it, sensing obstacles and people as it quickly plans and readjusts its flight paths. That means you can launch the thing and go for a walk. You can launch the thing and explore nature. You can launch the thing and go biking and the R1 will follow you with ease, never losing sight of you as it tries to keep up with you and capture the perfect shots in 4K.
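
Conceptually, “mapping the environment and readjusting flight paths” is continuous replanning over an obstacle map. Here’s a drastically simplified 2D sketch using breadth-first search on a grid; the R1’s real planner works in 3D with far richer perception, so treat this purely as an illustration of the replanning idea:

```python
from collections import deque

def plan(grid, start, goal):
    """Shortest path on a grid via BFS; 1 = obstacle, 0 = free cell."""
    rows, cols = len(grid), len(grid[0])
    prev = {start: None}
    queue = deque([start])
    while queue:
        cell = queue.popleft()
        if cell == goal:
            path = []                 # walk prev links back to rebuild the path
            while cell is not None:
                path.append(cell)
                cell = prev[cell]
            return path[::-1]
        r, c = cell
        for nxt in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            nr, nc = nxt
            if 0 <= nr < rows and 0 <= nc < cols and grid[nr][nc] == 0 and nxt not in prev:
                prev[nxt] = cell
                queue.append(nxt)
    return None                       # no route around the obstacles

# An obstacle appears mid-flight: update the map and replan around it.
grid = [
    [0, 0, 0],
    [1, 1, 0],   # a newly sensed obstacle blocks the direct route
    [0, 0, 0],
]
path = plan(grid, start=(0, 0), goal=(2, 0))
```

Run on every perception update, a loop like this is what lets a drone swerve around a tree without losing its subject.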

That was the company’s sell, anyway. I got my hands on one a few weeks ago to test it myself, and I’ve been zipping it around the greater West Coast, annoying and impressing many with what I’ve concluded is clearly the smartest drone on the planet.

The R1 has a number of autonomous modes to track users as it zips around. Not only can the drone follow you, it also can predict your path and wander in front of you. It can orbit around you as you move or follow along from the side. You can do all this by just tapping a mode, launching the drone and moving along. There are options for manual controls if you desire, but the R1 eschews the bulky drone controller for a simple, single-handed control system on the Skydio app on your phone.

The app is incredibly simple and offers a wide range of tracking modes that are pretty breezy to swipe through. Setting the drone up for the first flight was as simple as connecting to the drone via password and gliding through a couple of minutes of instructional content in the app. You can launch it off the ground or from your hand; I opted for the hand launch most times, which powers up the propellers until it’s tugging away from you, flying out a couple of meters and fixing its eye on you.

Walking around and having it follow you is cool and all, but this thing shines when you’re on the move and it’s speeding to catch up with you. It’s honestly so incredible to fire up the R1 and run through a dense forest with it trailing you; same goes for a bike ride. It speaks to Skydio’s technology how few hiccups it had in the midst of extended sessions, though by extended session I mean around 15 minutes, as that was the average flight time I got from a single battery charge. The Frontier Edition R1 ships with a second battery, which was a godsend.

When it comes to capturing precise, buttery smooth footage, there’s no replacement for a skilled drone pilot. Even with a perfectly good gimbal, the movements of the R1 are often pretty sudden and lead to direction changes that look a bit weird on camera. Not every continuous shot you gather from the R1 will make the cut, but what’s crazy is that you literally don’t have to do anything. It just follows and records you, leaving you a lot of footage that you’ll be able to pare down in editing.

There are some things I don’t love. It’s too big, for one; the company insists that it’s still small enough to fit in a backpack, but unless it’s a backpack that could also hold a 17-inch gaming laptop, I kind of doubt that. The body feels light yet substantial, but the rigidity of its outer frame and its overall size made me a little nervous at times that I was going to catastrophically break it, which was enough to make me consciously leave it at home when I went on a snowboarding trip.

I’m also a little distraught by the company’s decision to make this purely Wi-Fi controlled over your phone’s connection, a decision that definitely helps keep you from losing it, but also limits its core utility when it comes to tracking people who aren’t holding the phone. I sicced the drone on a friend of mine who was running around the neighborhood, but after he took off in a sprint, the R1 lost the signal and came to a stop over a street, where I was left trying to reconnect and move it to safety as cars zoomed by a few feet beneath it.

For $2,499, it’s not ridiculous to want some features that would make this more of a general-purpose drone, as well; all the propellers are there, so it doesn’t seem like it would be a stretch to offer an add-on controller that extends the range beyond the few hundred feet it currently manages.

Not a complaint at all, but I am excited to see the functionality this gains from future software updates; namely, I think it’d be really fun to track a pet (it currently can only recognize humans). At one point, when it was following me around a park, it majorly freaked out a bunch of dogs, who promptly started chasing it — and by extension, me. The sadist in me kind of wanted to chase them back with the R1.

The R1 is a $2,499 product whose signature feature makes it particularly attractive to first-time drone users, exactly the people who won’t spend that much money in the first place. In some ways this mismatch shows just how disruptive the tech could be, but in the short term the targeted buyer of this drone is an extremely tight niche.

For the early adopter who just loves getting the new thing: you’ll be pleased that it actually works and isn’t another half-baked dream on the road to autonomy. And if you’re a creator or vlogger who takes a lot of solo trips in the great outdoors, this drone could genuinely transform how you capture them and end up being a great buy — albeit a super pricey one.
