Note that the TensorFlow version is pinned to what SyntaxNet supports; building a working SyntaxNet installation requires additional operations that stock TensorFlow does not provide.
This version uses the newer DRAGNN models, which are much faster than the original SyntaxNet implementation!
Note that lang is the name of the subfolder in model_dir that contains the language-specific segmenter and parser models. The default directory to search for models is /usr/local/tfmodels/. If you have downloaded and extracted conll17.zip via the instructions above, you can launch the container like this:
# Mount the extracted models dir on host machine as volume in container
docker run -it -v <path/to/extracted/zip>:/usr/local/tfmodels/ nardeas/tensorflow-syntaxnet
and the above example should work out of the box. You only need to pass the lang parameter to the constructor (default "English").
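As a sketch of the calling convention described above: the wrapper takes a single lang argument naming the model subfolder under /usr/local/tfmodels/. The class name SyntaxNetWrapper below is a stand-in for illustration only; check the wrapper module shipped in the image for the real class and method names.

```python
import os

# Stand-in sketch, NOT the actual wrapper class from the image.
# It only demonstrates the lang -> model subfolder convention.
class SyntaxNetWrapper:
    MODEL_DIR = "/usr/local/tfmodels/"  # default model search directory

    def __init__(self, lang="English"):
        # lang selects the subfolder holding the language-specific
        # segmenter and parser models
        self.lang = lang
        self.model_path = os.path.join(self.MODEL_DIR, lang)

parser = SyntaxNetWrapper()  # defaults to "English"
print(parser.model_path)     # /usr/local/tfmodels/English
```

If you mounted your extracted models elsewhere inside the container, the lang value must still match a subfolder name under /usr/local/tfmodels/.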
Notes
This image contains a full TensorFlow installation. Any readily available pre-trained models are excluded from this image to keep it as lean as possible. Having SyntaxNet support doesn't produce much overhead, so this image is well suited for use with any other TF applications as well.
Also note that this version doesn't include the Bazel ops from the original SyntaxNet. In other words, you won't invoke binaries such as bazel-bin/syntaxnet/parser_eval; use the DRAGNN parser instead. The easiest way to get up and running fast is the included DRAGNN wrapper.