AI is steadily taking over tasks that once required humans, and researchers have run numerous experiments to test and train the capabilities of bots. Recently, OpenAI trained bots to carry out complicated tasks by having them watch 70,000 hours of Minecraft video and then imitate the actions used in the game. This is a major development and a powerful technique for training bots and machines.
OpenAI trains bots to craft diamond tools
The bot learned the complicated keyboard sequences and mouse clicks used in the video game, including chopping down trees and crafting tools. For a human player, the task takes around 20 minutes and some 24,000 actions. This is the first time a bot has been able to craft diamond tools in the game.
OpenAI trains bots with imitation learning
In imitation learning, a bot learns a task by observing demonstrations rather than by pure trial and error. It has been a breakthrough technique for AI in recent years, and has been instrumental in training bots and machines by having them observe humans performing various tasks, such as playing games, operating a fusion reactor, controlling robot arms, driving cars, navigating web pages, and finding more efficient methods for fundamental math.
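To make the idea concrete, here is a minimal, toy sketch of behavioral cloning, the simplest form of imitation learning: given (state, action) pairs from human demonstrations, the learned policy simply reproduces the most frequently demonstrated action in each state. The state and action names are hypothetical illustrations, not part of OpenAI's system.

```python
from collections import Counter, defaultdict

def clone_policy(demonstrations):
    """Toy behavioral cloning: pick the most frequently demonstrated
    action for each state seen in the demonstrations."""
    per_state = defaultdict(Counter)
    for state, action in demonstrations:
        per_state[state][action] += 1
    # The learned policy imitates the majority behavior in the demos.
    return {s: counts.most_common(1)[0][0] for s, counts in per_state.items()}

# Hypothetical Minecraft-flavored demonstration data.
demos = [
    ("tree_nearby", "chop"),
    ("tree_nearby", "chop"),
    ("tree_nearby", "wander"),
    ("logs_in_inventory", "craft_planks"),
]
policy = clone_policy(demos)
print(policy["tree_nearby"])        # prints "chop"
print(policy["logs_in_inventory"])  # prints "craft_planks"
```

A real system replaces the lookup table with a neural network mapping video frames to keyboard and mouse actions, but the principle is the same: copy what the demonstrators did.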
In older approaches to imitation learning, every step of a demonstration had to be labeled, which simplifies the learning task but makes hand-labeling laborious and keeps data sets small. OpenAI instead hired workers to play Minecraft for 2,000 hours, recording their keyboard presses and mouse clicks, and used this data to train a model.
This model was then used to generate labels for the full 70,000 hours of video. "Video is a training resource with a lot of potential," says Peter Stone, executive director of Sony AI America.
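The pipeline described above can be sketched as a toy pseudo-labeling loop: a labeler trained on a small hand-labeled set predicts actions for a much larger unlabeled corpus, which can then be used for imitation learning. The frame descriptions, action names, and the dictionary "model" are hypothetical stand-ins for real video data and a real neural network.

```python
def train_labeler(labeled_frames):
    """Toy stand-in for training on the small hand-labeled data set:
    memorize which action each observed frame corresponds to."""
    return dict(labeled_frames)

def pseudo_label(labeler, unlabeled_frames, default="no_op"):
    """Generate (frame, predicted_action) pairs for the large
    unlabeled corpus, falling back to a default for unseen frames."""
    return [(f, labeler.get(f, default)) for f in unlabeled_frames]

# Hypothetical small labeled set (the "2,000 hours").
small_labeled = [("swing_at_tree", "attack"), ("open_inventory", "press_e")]
# Hypothetical large unlabeled set (the "70,000 hours").
big_unlabeled = ["swing_at_tree", "swing_at_tree", "open_inventory", "idle"]

labeler = train_labeler(small_labeled)
labeled_corpus = pseudo_label(labeler, big_unlabeled)
print(labeled_corpus)
```

The payoff is leverage: a modest amount of expensive hand-labeled data unlocks a vastly larger pool of free online video for training.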
In Minecraft, players are free to play however they want, which makes the game a useful environment for training AI, and it is becoming an important testbed for new AI techniques. MineDojo, a Minecraft environment with dozens of predesigned challenges, won an award at NeurIPS 2022, one of the biggest AI conferences.
Using Video Pre-Training, OpenAI's bots carried out tasks such as crafting planks and turning them into a table, which takes around 970 consecutive actions. The best results came from combining imitation learning with reinforcement learning: the bot was then able to carry out tasks involving more than 20,000 consecutive actions.
Discussions are ongoing about whether bots can replicate these complicated tasks flawlessly or whether the result would be a mess. OpenAI has a lot of faith in the power of large data sets alone. Baker, who also worked on the research behind OpenAI's hide-and-seek bots, thinks this approach can make Minecraft AI better: "But with more data and bigger models, I would expect it to feel like you're watching a human playing the game, as opposed to a baby AI trying to mimic a human."