Teams are mapping the ocean depths and taking home millions in the Ocean Discovery Xprize

Most of the Earth's surface is covered by ocean, yet we know very little about what lies at the bottom of the sea. The sea floor may not remain uncharted much longer thanks to the Ocean Discovery Xprize, a competition to map it quickly, with the winning team taking home $4 million.

There is no comprehensive map of the ocean floor, and finally producing one would be enormously useful. The $4 million prize, sponsored by Shell, challenged teams to build a system that can map the sea floor and surface new findings. To qualify, a system had to cover hundreds of square kilometers of sea floor at a five-meter resolution in less than a day; existing methods are far less efficient and generally very costly.
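
To get a sense of the scale that requirement implies, here is a rough back-of-the-envelope sketch in Python. The 500 km² survey area is an assumed figure for illustration only; the rules merely say “hundreds of square kilometers.”

# Rough scale of the mapping task (illustrative numbers, not official prize figures).
AREA_KM2 = 500          # assumed survey area: "hundreds of square kilometers"
RESOLUTION_M = 5.0      # required map resolution in meters
HOURS = 24              # time limit: less than a day

area_m2 = AREA_KM2 * 1_000_000           # convert km^2 to m^2
cells = area_m2 / (RESOLUTION_M ** 2)    # number of 5 m x 5 m grid cells to cover
rate = cells / (HOURS * 3600)            # sustained mapping rate required

print(f"{cells:,.0f} depth cells")       # 20,000,000 depth cells
print(f"~{rate:,.0f} cells per second")  # ~231 cells per second

In other words, a winning system has to resolve a couple of hundred depth cells every second, around the clock, for a full day.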

That level of difficulty did not discourage the competitors. Since 2015, the teams have focused on building their systems and traveling all over the world to test them.

Testing originally took place in Puerto Rico, but after the 2017 hurricane the competition moved to the Greek coast. For the final round, the teams deployed their craft in the waters off Kalamata and set them to work mapping.

Jyotika Virmani, who led the program, said:

“It was a very arduous and audacious challenge.”

“The test itself was 24 hours, so they had to stay up, then immediately following that was 48 hours of data processing after which they had to give us the data. It takes more trad companies about 2 weeks or so to process data for a map once they have the raw data — we’re pushing for real time.”

“Nothing was damaged, nothing imploded,” she said. “We ran into weather issues, of course. And we did lose one piece of technology that was subsequently found by a Greek fisherman a few days later… but that’s another story.”

 

This neural network detects whether a photo has been Photoshopped

Photoshop is one of the most common image editing applications used by professionals, and manipulating pictures and touching up faces with it has become routine. Researchers from Berkeley and Adobe have come up with a new tool that can tell whether an image has been Photoshopped, how it was manipulated, and how the changes can be undone.

It is important to understand that the tool only works on images edited in Photoshop, not in other editing applications. A general-purpose tool that can spot any kind of manipulation is still a long way off, but the one built for Photoshop is already quite capable.

One of the researchers, Alexei Efros, who recently took part in an AI and robotics event, reasoned that because so many images are manipulated with Photoshop, the application itself was the best place to start looking for manipulations.

In one test, a set of portrait images was given slight manipulations, such as shifting the eyes or emphasizing the smile a bit more. Both the original and edited versions were then fed to the system so that it could learn to distinguish between them.
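
As a rough illustration of that kind of setup, and not the researchers' actual model or code, a minimal sketch of a binary classifier trained on original and edited portraits might look like this in PyTorch. The folder layout, backbone, and hyperparameters are all assumptions.

# Minimal sketch: train a small CNN to label portraits as "original" or "edited".
# Illustrative only; this is not the Adobe/Berkeley architecture.
import torch
import torch.nn as nn
from torch.utils.data import DataLoader
from torchvision import datasets, models, transforms

transform = transforms.Compose([
    transforms.Resize((224, 224)),
    transforms.ToTensor(),
])

# Assumed layout: portraits/original/*.jpg and portraits/edited/*.jpg
train_set = datasets.ImageFolder("portraits", transform=transform)
loader = DataLoader(train_set, batch_size=32, shuffle=True)

model = models.resnet18(weights=None)          # small CNN backbone
model.fc = nn.Linear(model.fc.in_features, 2)  # two classes: original / edited

optimizer = torch.optim.Adam(model.parameters(), lr=1e-4)
loss_fn = nn.CrossEntropyLoss()

model.train()
for epoch in range(5):
    for images, labels in loader:
        optimizer.zero_grad()
        loss = loss_fn(model(images), labels)
        loss.backward()
        optimizer.step()

The published tool goes well beyond this yes/no decision; as noted above, it can also show how the image was warped and suggest how to undo it.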

“We live in a world where it’s becoming harder to trust the digital information we consume,” said Adobe’s Richard Zhang, who worked on the project, “and I look forward to further exploring this area of research.”

A paper is available for anyone who wants to read more about the project.

 

 

Project xCloud bested Google Stadia in latency tests

At E3 2019, Microsoft gave attendees a chance to try out Project xCloud, its game streaming service and a direct rival to Google Stadia. Even though the server was almost 400 miles away, games such as Halo 5: Guardians and War 4 ran at 60 frames per second, streamed to an Android phone with a controller attached.

According to a report by Ars Technica, xCloud showed an input delay of just 67 milliseconds, while Stadia came in at 166 ms. For comparison, playing Halo 5 on a console gives an input lag of about 63 ms.

From the report:

“In our video tests, the time between tapping the A button and seeing a response on the smartphone screen took sixteen frames of a 240 FPS video or 67 ms across three subsequent tests. That’s almost imperceptibly slower than the 63 ms input latency Digital Foundry measured on the Xbox One version of Halo 5 in 2017… Testing latency of a wired… Stadia demonstration at March’s Game Developers Conference, Digital Foundry found total latency of 166 ms, compared to a low of 100 ms on a 60 FPS PC.”
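
The arithmetic behind those figures is straightforward: at 240 frames per second each frame lasts about 4.2 ms, so counting frames between the button press and the on-screen response gives the latency. A tiny sketch of the conversion, using the frame count quoted above:

# Convert a frame count from a high-speed video into input latency in milliseconds.
def latency_ms(frames: int, fps: int = 240) -> float:
    return frames / fps * 1000

print(latency_ms(16))  # 16 frames at 240 FPS ≈ 66.7 ms, reported as 67 ms for xCloud
# For comparison, the article cites 63 ms for Halo 5 on an Xbox One
# and 166 ms for the wired Stadia demo measured by Digital Foundry.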

It is worth noting that the test was conducted under less than ideal conditions, and xCloud still showed a lower input delay than Stadia managed in its earlier testing; there is also plenty of room for xCloud to improve its setup.

The same applies to Stadia: as its unveiling in March showed, Google has been working to keep input delays low. For now, we are simply waiting for the services to launch; Stadia will go live first for Founder’s Edition subscribers, while xCloud will be available as a public trial.

NASA’s ‘Snoopy’ lunar module may have been found after 50 years abandoned in space

NASA’s July 1969 landing on the lunar surface was preceded by a number of other missions, including Apollo 10, a mock landing that came very close to the surface without actually touching down. Astronauts Thomas Stafford and Eugene Cernan flew Apollo 10’s lunar module, nicknamed “Snoopy.” Once their task was complete, the module was cast off into space.

There was no way for Snoopy to return to Earth; it was sent into an orbit around the sun, out beyond the moon. After the astronauts returned to the command module, NASA stopped tracking the module’s trajectory. The search to recover that trajectory began in 2011, led by amateur astronomer Nick Howes.

According to the latest news, the searchers have all but tracked down the vehicle and will soon confirm its location; there is even hope that Elon Musk could one day retrieve and preserve what would be a key cultural artifact.

Apollo 10 was the fourth crewed mission of NASA’s Apollo program. Its flight plan called for taking the lunar module to within about ten miles of the moon’s surface without landing. Keeping with the Peanuts theme, the lunar module was nicknamed “Snoopy” while the command module was called “Charlie Brown.”

The lunar module’s fuel tanks were deliberately left short of what a landing and return would have required. This intentional limitation ensured that the astronauts flying the test run could not jump the queue and land before Apollo 11’s Neil Armstrong and Buzz Aldrin could make history.
