Open Cognitive Environment

Welcome to the OpenCE project. The project contains everything needed to build conda packages for a collection of machine learning and deep learning frameworks. All packages created for a specific version of OpenCE are designed to be installed within a single conda environment. The table below lists the packages and versions provided by each OpenCE environment.

Environment  | opence-v1.3.1 | opence-v1.2.2 | opence-v1.1.2 | opence-v1.0.0
python       | 3.8.0         | 3.8.0         | 3.8.12        | 3.8.12
cuda         | 11.2.2        | 11.0.221      | 10.2.89       | 10.2.89
cudnn        | 8.1.1         | 8.1.1         | 7.6.5         | 7.6.5
nccl         | 2.8.3         | 2.7.8         | 2.7.8         | 2.7.8
openmpi      | 4.1.1         | 3.1.3         | 3.1.3         | 3.1.3
apex         | 0.1           | N/A           | N/A           | N/A
hdf5         | 1.10.4        | 1.10.6        | 1.10.4        | 1.10.4
horovod      | 0.21.3        | 0.21.0        | 0.21.0        | 0.19.5
ipython      | 7.27.0        | 7.29.0        | 7.29.0        | 7.28.0
matplotlib   | 3.4.3         | 3.4.3         | 3.4.3         | 3.4.2
mpi4py       | 3.1.1         | N/A           | N/A           | N/A
numpy        | 1.21.2        | 1.19.5        | 1.19.5        | 1.19.2
onnx         | 1.7.0         | 1.6.0         | 1.6.0         | 1.6.0
opencv       | 3.4.14        | 4.5.0         | 3.4.10        | 3.4.10
pandas       | 1.3.2         | 1.3.4         | 1.3.4         | 1.2.4
pytorch      | 1.8.1         | 1.7.1         | 1.7.1         | 1.6.0
scikit-learn | 0.24.2        | 1.0.1         | 1.0.1         | 1.0.1
scipy        | 1.7.1         | 1.7.1         | 1.4.1         | 1.4.1
tensorflow   | 2.5.1         | 2.4.1         | 2.4.1         | 2.3.1
tensorboard  | 2.5.0         | 2.4.1         | 2.4.1         | 2.3.0
transformers | 4.4.2         | 2.1.1         | 4.12.2        | 4.12.2
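
Once one of the opence modules below is loaded, the versions listed in the table can be checked directly from Python. A minimal sketch, assuming the environment provides numpy, tensorflow, and pytorch as listed above:

# Minimal version check for the loaded OpenCE environment.
import numpy as np
import tensorflow as tf
import torch

print("numpy      :", np.__version__)
print("tensorflow :", tf.__version__)
print("pytorch    :", torch.__version__)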

Simple Example with TensorFlow

Interactive mode

Get a node for interactive use:

swrun -p gpux1

Once on the compute node, load the opence module using one of these:

module load opence
module load opence-v1.3.1
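
With the module loaded, you can confirm that TensorFlow detects the node's GPUs before starting a training run. A minimal sketch using the TensorFlow 2.x API:

import tensorflow as tf

# Lists the GPUs visible to TensorFlow; an empty list means no GPU was detected.
print(tf.config.list_physical_devices('GPU'))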

Copy the following code into a file named "mnist-demo.py":

import tensorflow as tf

# Load the MNIST dataset and scale pixel values to the [0, 1] range.
mnist = tf.keras.datasets.mnist
(x_train, y_train), (x_test, y_test) = mnist.load_data()
x_train, x_test = x_train / 255.0, x_test / 255.0

# Simple fully connected network: flatten the 28x28 images, one hidden layer
# with dropout, and a softmax output over the 10 digit classes.
model = tf.keras.models.Sequential([
  tf.keras.layers.Flatten(input_shape=(28, 28)),
  tf.keras.layers.Dense(512, activation=tf.nn.relu),
  tf.keras.layers.Dropout(0.2),
  tf.keras.layers.Dense(10, activation=tf.nn.softmax)
])
model.compile(optimizer='adam',
              loss='sparse_categorical_crossentropy',
              metrics=['accuracy'])

# Train for 5 epochs, then report loss and accuracy on the test set.
model.fit(x_train, y_train, epochs=5)
model.evaluate(x_test, y_test)

Train on MNIST with the Keras API:

python ./mnist-demo.py

Batch mode

The same can be accomplished in batch mode by downloading and submitting the tf_sample.swb script:

wget https://wiki.ncsa.illinois.edu/download/attachments/82510352/tf_sample.swb
sbatch tf_sample.swb
squeue

Visualization with TensorBoard

Interactive mode

Get a node for interactive use:

swrun -p gpux1

Once on the compute node, load the opence module using one of these:

module load opence
module load opence-v1.3.1

Download the mnist-with-summaries.py script to the $HOME folder:

cd ~
wget https://wiki.ncsa.illinois.edu/download/attachments/82510352/mnist-with-summaries.py

Train on MNIST with TensorFlow summaries:

python ./mnist-with-summaries.py

Batch mode

The same can be accomplished in batch mode by downloading and submitting the tfbd_sample.swb script:

wget https://wiki.ncsa.illinois.edu/download/attachments/82510352/tfbd_sample.swb
sbatch tfbd_sample.swb
squeue

Start the TensorBoard session

After the job completes, the TensorFlow log files can be found in "~/tensorflow/mnist/logs". Start the TensorBoard server on hal-ondemand; for details, refer to Getting started with HAL OnDemand.
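
The mnist-with-summaries.py sample writes TensorFlow event files that TensorBoard reads from that log directory. If you want to add the same kind of logging to your own Keras script, a minimal sketch (assuming TensorFlow 2.x and the same "~/tensorflow/mnist/logs" path) is:

import os
import tensorflow as tf

# Assumed log directory; must match the directory TensorBoard reads from.
log_dir = os.path.expanduser("~/tensorflow/mnist/logs")

(x_train, y_train), (x_test, y_test) = tf.keras.datasets.mnist.load_data()
x_train, x_test = x_train / 255.0, x_test / 255.0

model = tf.keras.models.Sequential([
    tf.keras.layers.Flatten(input_shape=(28, 28)),
    tf.keras.layers.Dense(512, activation=tf.nn.relu),
    tf.keras.layers.Dense(10, activation=tf.nn.softmax)
])
model.compile(optimizer='adam',
              loss='sparse_categorical_crossentropy',
              metrics=['accuracy'])

# The TensorBoard callback writes per-epoch scalars (loss, accuracy) to log_dir.
tensorboard_cb = tf.keras.callbacks.TensorBoard(log_dir=log_dir, histogram_freq=1)
model.fit(x_train, y_train, epochs=5,
          validation_data=(x_test, y_test),
          callbacks=[tensorboard_cb])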

Simple Example with PyTorch

Interactive mode

Get a node for interactive use:

swrun -p gpux1

Once on the compute node, load the opence module using one of these:

module load opence
module load opence-v1.3.1

Install the PyTorch samples:

pytorch-install-samples ~/pytorch-samples
cd ~/pytorch-samples

Train on MNIST with PyTorch:

python ./examples/mnist/main.py
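
Before running the full sample, a short sanity check confirms that this PyTorch build has CUDA support and can see the node's GPUs. A minimal sketch:

import torch

print("PyTorch version:", torch.__version__)
print("CUDA available :", torch.cuda.is_available())
if torch.cuda.is_available():
    print("GPU count      :", torch.cuda.device_count())
    print("GPU 0          :", torch.cuda.get_device_name(0))
    # Small matrix multiply on the GPU to confirm the CUDA runtime works.
    x = torch.randn(1024, 1024, device="cuda")
    print("Matmul shape   :", (x @ x).shape)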