Google DeepMind Publishes Atari Q Learner Source Code
DeepMind explored their Atari-playing AI a bit further and published an article in Nature. Along with the article comes the source code, which I ran on my computer to check that it works 🙂 And it does. The code, with a few changes (to draw the game screen), is available in my GitHub.
Hi,
I have tried running the code in your GitHub, but it certainly doesn’t run as the original README file suggests. I have tried installing each dependency manually, and now I am stuck installing Xitari and AleWrap. Would you mind providing a better explanation? By the way, I am a complete Linux noob; I decided to install Ubuntu for the first time in my life just because of this program (I’ve tried running it on Windows too, and failed again).
Thanks
Hi, there was a missing “qlua” dependency in the installation script. It is fixed now. You can try again: remove the old installation, clone the repository again and run ./install_dependencies.sh.
If you have torch or lua installed system-wide, clear $LUA_PATH before running; otherwise lua will look into your global installation, not the DQN-specific one.
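For example, in the same shell session where you will launch the training script, you can clear the Lua search paths first (a minimal sketch for a POSIX shell):

```shell
# Make Lua/Torch modules resolve from the repo-local install
# rather than a system-wide Torch: clear the Lua search paths.
unset LUA_PATH
unset LUA_CPATH   # native (.so) modules are looked up via this one
```

After this, qlua will fall back to its own default search paths, so the DQN-specific installation is picked up.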
On Ubuntu 14.04 it should run out of the box. Feel free to contact me if it does not.
Hi, Kuz,
I am trying to deploy the DeepMind code. It’s pretty simple to install the dependencies by merely running install_dependencies.sh; however, after that (without any error), when I try to run it, it says no “qlua” is found. I am pretty confused about that. Do you know how to install qlua? If you could provide any information, that would be great. Many thanks.
Hi,
the “qlua” issue is fixed now, please try again.
Sorry for the long delay 😛
Hey, I’m trying to run this on a new Ubuntu VM set up in Azure. It worked on Friday on an AWS VM, but here it is falling over with an error:
Iteration .. 0
qlua: not loading module qtgui (running with -nographics)
qlua: qtwidget window functions will not be usable (running with -nographics)
qlua: not loading module qtuiloader (running with -nographics)
qlua: …-Atari-Deep-Q-Learner/torch/share/lua/5.1/image/init.lua:1451: attempt to index global 'qtuiloader' (a nil value)
stack traceback:
[C]: in function '__index'
…-Atari-Deep-Q-Learner/torch/share/lua/5.1/image/init.lua:1451: in function 'window'
…-Atari-Deep-Q-Learner/torch/share/lua/5.1/image/init.lua:1402: in function 'display'
train_agent.lua:98: in main chunk
Could it be some stuff that our lord and master Microsoft have benevolently installed on their Ubuntu image?
Update: I started up my AWS instance for comparison this afternoon and it is now showing the same error. FML.
Hi Charlie,
I did not try to run this code on AWS, so I don’t have an answer.
Try the list of issues at https://github.com/kuz/DeepMind-Atari-Deep-Q-Learner/issues; maybe there is something similar. If not, just create one; probably somebody has had the same problem before.
And the Academy Award for Numb-Nut of the Year 2016 goes to… me!
I was running in PuTTY on one machine and MobaXterm on the other, so PuTTY was erroring out because there was no X11 server.
Thanks for the help; now off to bang my head against a wall.
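For anyone else hitting this: image.display needs a reachable X11 server on the connecting side. A quick diagnostic sketch for a POSIX shell (the fix itself is connecting with “ssh -X user@host”, or enabling X11 forwarding in PuTTY with a local X server such as VcXsrv or Xming; MobaXterm ships its own):

```shell
# qlua's qt windows (used by image.display) need an X server,
# which over SSH means X11 forwarding. If DISPLAY is empty,
# the qt modules are skipped and image.display crashes as above.
if [ -z "$DISPLAY" ]; then
    echo "DISPLAY is not set: image.display will fail"
else
    echo "X11 available on $DISPLAY"
fi
```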
On the plus side, it looks like MS are releasing the new N-series VMs that get access to some pretty hefty K80s, so it might be fun to open a free Azure account for a month and see how well DM runs against 4 of them 😀
Just out of interest, how long did it take to generate the gif at the top? I’m running on a rather beefy (or at least I would like to think it is) 16-core 112 GB server and I’m seeing performance drop massively after about 55,000 steps. Is that just the CPU vs GPU performance cliff?
Hello Kuz,
How much time will it take to train it on a CPU?
And for the full implementation it is not necessary to use a GPU, right?
Hi,
I did not measure times on CPU, but somewhere in the back of my head I have a number that it will be about 30 times slower. The gif above was generated after 2 days of training, so about 2 months on a CPU 🙂
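The back-of-the-envelope arithmetic, for what it’s worth (the 30x factor is a rough recollection, not a measurement):

```shell
# 2 GPU-days at a ~30x CPU slowdown:
echo "$(( 2 * 30 )) CPU-days"   # prints "60 CPU-days", i.e. about 2 months
```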
Could you estimate the minimum running specs for this? I am interested in getting a new linux system.
16 GB RAM, NVidia 9XX GPU or better, Ubuntu 14.04
Do you think an NVidia GeForce 750 would work well enough?
It should work, but it has only 512 cores and is slower. It depends on how you define “well enough”.
Ended up going with the GTX 950. I am also wondering how to restart the program from a pretrained network. After running it for 5 days, I stopped the program, and now the network is stored in DQN3_0_1_breakout_FULL_Y_.t7. Should I simply change the netfile parameter in run_gpu to "\"DQN3_0_1_breakout_FULL_Y_.t7\"" instead of "\"convnet_atari3\""?
I’ve also tried adding "-network DQN3_0_1_breakout_FULL_Y.t7" to the args parameter in run_gpu, but it still trains the network from scratch.
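A sketch of what editing netfile would look like, assuming the stock run_gpu script from the repo (the snapshot name is the one from the comment above). As far as I remember, the agent treats a network name containing ".t7" as a snapshot to load with torch.load, so the escaped quoting around the value has to stay intact; editing netfile is also safer than appending a second -network flag to args, since which duplicate wins depends on the argument parser:

```shell
# In run_gpu, the default network definition is roughly:
#   netfile="\"convnet_atari3\""
# To resume from the saved snapshot, point it at the .t7 file instead:
netfile="\"DQN3_0_1_breakout_FULL_Y_.t7\""
echo "-network $netfile"   # what ends up in the args string
```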