The Halite III Server Environment


We hope this listing of our server environment, compute and memory limits, and installed languages and libraries helps you navigate the Halite III :cloud:!

The Halite III game servers run on Google Cloud Compute Engine instances with 4 vCPUs and 6 GB of memory. The servers run Ubuntu 18.04.

  • Bots are given 1 GB of RAM and are pinned to a single CPU.
  • Threads: you may spawn additional threads, but because your bot is pinned to one CPU, extra threads will not let your code do more work in the same amount of time.
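You can observe the effect of this pinning from inside a bot by comparing the machine's reported core count with the set of CPUs the scheduler actually allows the process to use (a Linux-only sketch; on a pinned game-server bot the allowed set should contain a single CPU, while on your own machine it will usually match the core count):

```python
import os

# Total CPUs the machine reports. cgroup pinning does not hide the
# other cores, so on the game servers this would still show 4.
visible_cpus = os.cpu_count()

# CPUs this process is actually allowed to run on (affinity-aware).
# A pinned bot would see a set of length 1, which is why spawning
# extra threads cannot buy additional wall-clock work.
allowed_cpus = os.sched_getaffinity(0)

print(f"visible: {visible_cpus}, allowed: {len(allowed_cpus)}")
```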

GPU-Enabled Bots
For GPU-enabled bots, each bot on a GPU worker has access to one NVIDIA Tesla K80 GPU. The rate-of-play for GPU-enabled bots is limited for the sake of our servers. Your GPU-enabled bot will play fewer games and may change ranking more slowly than standard bots.

Installing Dependencies
Instead of packaging your dependencies inside your zip file, you may include a bash file named install.sh that will run before your bot is compiled.

Your bash file runs with internet access and write access to only its current directory, and cannot make global installation changes (i.e. apt-get will not work). You have curl and wget to download additional runtimes, tools, and environments. You have a maximum of 10 minutes to run your installation script.

For example, to install dependencies from the SVM starter kit, your bash file should include the following:

python3.6 -m pip install --target . tqdm
python3.6 -m pip install --target . zstd
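Packages installed with --target . land next to MyBot.py rather than in a system site-packages directory. The engine normally launches the bot from its own directory, so imports usually just work, but if they fail at runtime a defensive MyBot.py can pin its own directory onto the import path first (a small sketch, not required by the docs):

```python
import os
import sys

# Directory containing this file -- where `pip install --target .`
# placed the dependencies bundled with the bot.
BOT_DIR = os.path.dirname(os.path.abspath(__file__))

# Search it first, regardless of the working directory the game
# engine happens to start the bot in.
if BOT_DIR not in sys.path:
    sys.path.insert(0, BOT_DIR)

# import tqdm  # now resolvable from the bundled copy
```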

Bot Compilation
Bot compilation is done using this autocompile script. Many languages will be properly auto-detected and compiled if necessary without the need for an install script.

Your main file must be called MyBot.{ext}. The autocompile script recognizes the language of your bot by its .{ext}.

Exceptions to the .{ext} convention:

  • Rust: a Cargo.toml in the root is detected as Rust, and compiled with cargo rustc.
  • Swift: a Package.swift in the root is detected as Swift, and compiled with swift build.
  • Haskell: You may upload a MyBot.hs, or you may upload a stack.yaml, in which case your code compiles with stack build.
  • Elixir: Upload a mix.exs. Your code compiles with mix deps.get followed by mix.
  • Clojure: Upload a project.clj. Your code compiles with lein uberjar.
  • .NET: Upload a MyBot.csproj or MyBot.fsproj. Your code compiles with dotnet restore followed by dotnet build.

JVM Languages Tip:

  • For JVM languages, you can submit a jar file inside of your zip file instead of source files. The jar will be executed with java -jar MyBot.jar, so you need to define a Main-Class header in the manifest.

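If you build the jar yourself, the manifest needs a Main-Class entry so that java -jar MyBot.jar knows where to start. A minimal MANIFEST.MF might look like this (assuming your entry point is a class named MyBot in the default package):

```
Manifest-Version: 1.0
Main-Class: MyBot
```

With the standard jar tool, jar cfe MyBot.jar MyBot *.class sets the same entry point without a hand-written manifest file.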
Installed Languages and Packages
The following languages and packages are preinstalled on the game servers:

Languages and Compilers Supported

  • cmake 3.10.2
  • Clojure (Leiningen 2.8.1)
  • CoffeeScript 1.12.7
  • Common Lisp (SBCL 1.4.5.debian)
  • Conda 4.5.11
  • Dart 2.0.0
  • D - DMD64 D Compiler 2.082.0
  • Elixir 1.7.3 (compiled with Erlang/OTP 20)
  • Erlang (SMP, ASYNC_THREADS, HIPE) (BEAM) emulator version 10.1
  • Forth (Gforth 0.7.3)
  • Fortran (GFortran 7.3.0)
  • GCC 7.3.0
  • Go 1.11.1
  • Groovy 2.5.2
  • Haskell 8.6.1
  • JavaScript (NodeJS 10.12.0-1)
  • Julia 1.0.0
  • Lua 5.2.4
  • Lua 5.3
  • LuaJIT 2.1.0-beta3
  • Mono JIT compiler
  • OCaml 4.05.0
  • Octave 4.2.2
  • PHP 7.2
  • Python 3.6.6
  • PyPy 6.0.0 with GCC 7.2.0
  • Cython 0.26.1-0.4
  • Racket 6.11
  • Ruby 2.5.1
  • Rustc 1.29.1
  • Rustup 1.14.0
  • Cargo 1.29.0
  • Scala 2.11.12-2
  • Scheme (MIT Scheme 9.1.1)
  • Swift 4.2


Python 3.6 Packages

  • numpy 1.15.2
  • scipy 1.1.0
  • scikit-learn 0.20.0
  • flask 1.0.2
  • tensorflow 1.11.0
  • torchvision 0.2.1
  • pillow 5.3.0
  • opencv-python
  • h5py 2.8.0
  • keras 2.2.4
  • theano 1.0.3
  • shapely 1.6.4.post2
  • cython 0.28.5
  • pandas 0.23.4

PyPy 6.0 (Python 3.5) Packages

  • numpy 1.15.2
  • scipy 1.1.0
  • scikit-learn 0.20.0
  • flask 1.0.2

Ruby Gems

  • bundler (1.16.6)

GPU workers additionally have these packages installed:

  • libcupti-dev:amd64 9.1.85-3ubuntu1
  • libcudnn7
  • nvidia-410 410.48-0ubuntu1
  • cuda-9-0 9.0.176-1
  • tensorflow-gpu 1.11.0
  • torch 0.2.0.post3


So I have two questions:

1. Since the BEAM for Erlang is compiled with SMP, is multiprocessing being restricted in some way?

I'm curious how that works. Erlang usually auto-scales the number of schedulers to the number of cores available with the default SMP flag. Is the number of schedulers being restricted (or do the vCPUs only expose a single core)?

2. For the GPU-enabled bots, are we able to use CUDA at all without going through one of the GPU libraries like TensorFlow? I assume we can't use CUDA from C++, Python, or Fortran?



@grantmwilliams We use cgroup cpusets to pin your bot to a specific vCPU. However, your bot will still see that it runs on a 4-core machine. If I understand correctly, that means BEAM will probably use 4 schedulers. We can tweak parameters to BEAM if that makes a difference.

You can use CUDA directly, but the development headers aren’t installed on the compilation workers, which would make that hard. You could build the bot locally, dynamically linking to the right version of CUDA, then submit that, but that would be tricky.


@lidavidm, thank you!


The C++ Starter Kit uses make_unique (line 26, game_map.cpp), which is C++14, but the server config builds against C++11. My bot fails to build unless I provide my own version of make_unique. Is this intentional?


Supplying a CMakeLists.txt seems to allow us to override this.
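For anyone hitting the same issue, a CMakeLists.txt at the root that requests C++14 appears to take precedence over the server default. A minimal sketch (the target name and the hlt/ source glob here are assumptions; match them to your own layout):

```cmake
cmake_minimum_required(VERSION 3.10)
project(MyBot)

# Ask for C++14 so std::make_unique from the starter kit compiles.
set(CMAKE_CXX_STANDARD 14)
set(CMAKE_CXX_STANDARD_REQUIRED ON)

file(GLOB_RECURSE SOURCES hlt/*.cpp)
add_executable(MyBot MyBot.cpp ${SOURCES})
```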


I haven’t seen it anywhere, do bots ‘install’ or ‘compile’ for every match or just upon upload?
Or, conversely, is a bot persistent between matches?


Bots get compiled on upload, and then that compiled bot is used fresh every time it plays a new match. Bots can’t persist data between matches - you start each game just as compiled.


Correct me if I’m wrong… torchvision is installed but pytorch is not? Is there a reason for this?

Edit: I submitted a bot that imports pytorch and it runs, so perhaps it’s just not listed here.


Hi, I’m confused about the .NET environment: “dotnet build” and “dotnet restore” are .NET Core commands, but the compiler listed is Mono (which I thought used “xbuild”)… is one of these incorrect, or is there some kind of cross-compilation magic going on?



We have both installed. If you include a MyBot.csproj or MyBot.fsproj, we will compile via .NET Core. If you upload a MyBot.cs instead, we will compile through Mono.
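As a concrete example, a minimal MyBot.csproj that the .NET Core path can pick up might look like this (the target framework version is an assumption; use whatever your local dotnet toolchain and the server runtime support):

```xml
<Project Sdk="Microsoft.NET.Sdk">
  <PropertyGroup>
    <OutputType>Exe</OutputType>
    <TargetFramework>netcoreapp2.1</TargetFramework>
  </PropertyGroup>
</Project>
```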


I’m wondering: if I use my own CUDA kernels in my Rust bot (via libc, not for machine learning), will it compile in the server environment? (If yes, how?)
