WekaDeeplearning4j ships with an automated JUnit test suite; however, some bugs may only show up in the WEKA GUI. Because of this, it's important to manually test a range of scenarios in the GUI before confirming the package is ready for release.
These scenarios can be tested either:

- Manually, as per the instructions below, or
- Automatically, by running the parenthesised `.sh` script from `weka-run-test-scripts/`. This folder also contains `run_all_tests.sh`, which runs all of these tests (you should still check the output to ensure the parameters are applied correctly). Check out the README for more info.
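For example, to exercise a single scenario from the shell, or the whole suite at once (assuming the scripts are executable and can find your WEKA installation):

```bash
# From the package root: run one scripted scenario...
cd weka-run-test-scripts
./Dl4jMlpClassifier_random_default.sh

# ...or run every scripted scenario in sequence. Still skim the output to make
# sure each parameter was actually applied.
./run_all_tests.sh
```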
The following are run with randomly generated data (using the default Generate window)
- (`Dl4jMlpClassifier_random_default.sh`) Run with all default parameters
The following are run with the mnist-minimal dataset loaded, with the `ImageInstanceIterator` using the following settings:

- `directory of images` pointing to the `mnist-minimal/` image folder
- `desired width` = 224
- `desired height` = 224
- `desired number of channels` = 3

You may like to set `number of epochs` to something smaller and set `Test options` to `Percentage split` to run these quicker.
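For reference, the scripted equivalent of this GUI setup looks roughly like the sketch below. Treat it purely as a sketch: the option names (`-iterator`, `-imagesLocation`, `-width`, `-height`, `-numChannels`, `-numEpochs`, `-split-percentage`) and the placeholder paths are assumptions based on the package's CLI examples; the scripts in `weka-run-test-scripts/` are the authoritative commands.

```bash
# Rough sketch only -- option names and paths are assumptions; check the
# corresponding script in weka-run-test-scripts/ for the real command.
java weka.Run .Dl4jMlpClassifier \
    -iterator "weka.dl4j.iterators.instance.ImageInstanceIterator -imagesLocation path/to/mnist-minimal -width 224 -height 224 -numChannels 3" \
    -numEpochs 2 \
    -t path/to/mnist-minimal.arff \
    -split-percentage 66
```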
- (`Dl4jMlpClassifier_mnist_default.sh`) Run model with all default parameters
- (`Dl4jMlpClassifier_mnist_extraLayer.sh`) Run model with an added `DenseLayer` with `nOut = 32`
- (`Dl4jMlpClassifier_mnist_AlexNet.sh`) Run model with `Dl4jAlexNet` as the zoo model
- (`Dl4jMlpClassifier_mnist_ResNet50.sh`) Run model with `Dl4jResNet50` as the zoo model
- (`Dl4jMlpClassifier_mnist_EffNetB2.sh`) Run model with `KerasEfficientNet` as the zoo model, with `EFFICIENTNET_B2` as the variation
    - Ensure that the model actually uses the variation; you should see a log message something like `...Using cached model at /home/rhys/.deeplearning4j/models/keras_efficientnet/KerasEfficientNetB2.zip...`
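When running the EffNetB2 scenario from its script rather than the GUI, a quick way to confirm the variation was picked up is to capture the output and search for that cached-model message (the cache path will differ on your machine; the log file name below is arbitrary):

```bash
# Capture the scenario's output and confirm the EfficientNet-B2 weights were
# actually loaded.
cd weka-run-test-scripts
./Dl4jMlpClassifier_mnist_EffNetB2.sh 2>&1 | tee effnet_b2.log
grep "Using cached model" effnet_b2.log   # expect ...KerasEfficientNetB2.zip
```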
The following are run with the mnist-minimal dataset loaded. You'll need to click Undo after each test to revert the instances.
- (`Dl4jMlpFilter_mnist_default.sh`) Run with default filter settings (uses `Dl4jResNet50` as the model)
- (`Dl4jMlpFilter_mnist_extraLayer.sh`) Run with `Dl4jResNet50` as the zoo model, set `Use default feature layer` to `False`, and add `res4a_branch2b` to the `feature extraction layers`
    - Ensure that the logging output contains something like `...Getting features from layers: [res4a_branch2b, flatten_1]` and that there are attributes from both feature layers (i.e., named `res4a_branch2b` and `flatten_1`)
- (`Dl4jMlpFilter_mnist_DenseNet.sh`) Run with `KerasDenseNet` as the zoo model and `Use default feature extraction layer` set to `True`
- (`Dl4jMlpFilter_mnist_ResNet101v2.sh`) Run with `KerasResNet` as the zoo model, variation set to `RESNET101V2`, and `Use default feature extraction layer` set to `True`
    - Ensure that `RESNET101V2` is actually used as the variation. You should see a logging message something like `...Using cached model at /home/rhys/.deeplearning4j/models/keras_resnet/KerasResNet101V2.zip`
- (`Dl4jMlpFilter_mnist_EffNetB1ExtraLayer.sh`) Run with `KerasEfficientNet` as the zoo model, variation set to `EFFICIENTNET_B1`, `Use default feature layer` set to `False`, and add `block4c_expand_conv` to the `feature extraction layers`
    - Again, ensure the variation is properly set and that the resulting attributes contain features from both layers (see the log-check sketch after this list)
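The same log-capture trick works for the filter scenarios when they are run from their scripts; for example, to check the extra-layer run and the ResNet101V2 variation (log file names are arbitrary):

```bash
# Verify the extra feature-extraction layer and the zoo-model variation from
# the script output.
cd weka-run-test-scripts
./Dl4jMlpFilter_mnist_extraLayer.sh 2>&1 | tee filter_extra_layer.log
grep "Getting features from layers" filter_extra_layer.log   # expect [res4a_branch2b, flatten_1]

./Dl4jMlpFilter_mnist_ResNet101v2.sh 2>&1 | tee filter_resnet101v2.log
grep "Using cached model" filter_resnet101v2.log              # expect ...KerasResNet101V2.zip
```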
The following are run using the mnist_784 convolutional dataset (`src/test/resources/nominal/mnist_784_train_minimal.arff`)
- (`Dl4jMlpFilter_mnist_LeNet.sh`) Run with `Dl4jLeNet` as the zoo model, `Use default feature layer` set to `True`, and a default `ConvolutionInstanceIterator` as the `instance iterator`
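A rough scripted equivalent of this last scenario is sketched below; the default `ConvolutionInstanceIterator` shape fits this dataset because each mnist_784 row holds 784 pixel attributes, i.e. 28 x 28 x 1. The `Dl4jMlpFilter` option names (`-zooModel`, `-iterator`, `-default-feature-layer`) and the class names used here are assumptions; `Dl4jMlpFilter_mnist_LeNet.sh` remains the authoritative command.

```bash
# Hypothetical sketch -- option names are assumptions; see
# weka-run-test-scripts/Dl4jMlpFilter_mnist_LeNet.sh for the real command.
# mnist_784 stores each image as 784 attributes (28 x 28 x 1), which the
# ConvolutionInstanceIterator reshapes back into images for the zoo model.
java weka.Run .Dl4jMlpFilter \
    -i src/test/resources/nominal/mnist_784_train_minimal.arff \
    -c last \
    -iterator "weka.dl4j.iterators.instance.ConvolutionInstanceIterator -height 28 -width 28 -numChannels 1" \
    -zooModel "weka.dl4j.zoo.Dl4jLeNet" \
    -default-feature-layer
```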