Wing / boxing-moves
Primary version

Training settings

Number of training cycles: 85 (set via epochs in the code below)
Learning rate: 0.01 (set via the Adam optimizer in the code below)
Train/validation split: a fraction between 0 and 1, shown as a percentage
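
For reference, the following is a minimal sketch of how a train/validation split fraction can be applied to an in-memory dataset with tf.data. The arrays X and y, the 0.2 split value, and the random data are placeholders used only for illustration; in this project the train_dataset and validation_dataset objects are prepared by the surrounding training pipeline.

import numpy as np
import tensorflow as tf

NUM_FEATURES = 33       # matches the input layer below
NUM_CLASSES = 5         # matches the output layer below
VALIDATION_SPLIT = 0.2  # placeholder split fraction (between 0 and 1)

# Placeholder data standing in for the real feature/label arrays.
X = np.random.rand(200, NUM_FEATURES).astype(np.float32)
y = np.random.randint(0, NUM_CLASSES, size=200)

# One-hot labels to match the categorical_crossentropy loss used in training.
dataset = tf.data.Dataset.from_tensor_slices((X, tf.one_hot(y, depth=NUM_CLASSES)))
dataset = dataset.shuffle(buffer_size=200, seed=42)

num_validation = int(200 * VALIDATION_SPLIT)
validation_dataset = dataset.take(num_validation)
train_dataset = dataset.skip(num_validation)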

Neural network architecture

import tensorflow as tf
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import Dense, InputLayer, Dropout, Conv1D, Conv2D, Flatten, Reshape, MaxPooling1D, MaxPooling2D, BatchNormalization, TimeDistributed
from tensorflow.keras.optimizers import Adam

# model architecture: three fully connected layers with L1 activity
# regularization, followed by a softmax output over the classes
model = Sequential()
model.add(Dense(20, activation='relu',
    activity_regularizer=tf.keras.regularizers.l1(0.00001)))
model.add(Dense(20, activation='relu',
    activity_regularizer=tf.keras.regularizers.l1(0.00001)))
model.add(Dense(10, activation='relu',
    activity_regularizer=tf.keras.regularizers.l1(0.00001)))
model.add(Dense(classes, activation='softmax', name='y_pred'))

# this controls the learning rate
opt = Adam(learning_rate=0.01, beta_1=0.9, beta_2=0.999)

# this controls the batch size, or you can manipulate the tf.data.Dataset objects yourself
# (classes, train_dataset, validation_dataset, callbacks, train_sample_count and
# BatchLoggerCallback are supplied by the surrounding training environment)
BATCH_SIZE = 8
train_dataset = train_dataset.batch(BATCH_SIZE, drop_remainder=False)
validation_dataset = validation_dataset.batch(BATCH_SIZE, drop_remainder=False)
callbacks.append(BatchLoggerCallback(BATCH_SIZE, train_sample_count))

# train the neural network for 85 epochs (the number of training cycles above)
model.compile(loss='categorical_crossentropy', optimizer=opt, metrics=['accuracy'])
model.fit(train_dataset, epochs=85, validation_data=validation_dataset, verbose=2, callbacks=callbacks)

# Use this flag to disable per-channel quantization for a model.
# This can reduce RAM usage for convolutional models, but may have
# an impact on accuracy.
disable_per_channel_quantization = False
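
The disable_per_channel_quantization flag at the end of the code is read by the quantization step that runs after training. As a rough illustration only, the sketch below shows one common way a trained Keras model like the one above can be converted to a full-integer TFLite model; the representative_dataset generator, its random samples, and the output filename are placeholders, not part of this project's pipeline.

import numpy as np
import tensorflow as tf

def representative_dataset():
    # Placeholder calibration samples with the same 33-feature shape as the input layer.
    for _ in range(100):
        yield [np.random.rand(1, 33).astype(np.float32)]

converter = tf.lite.TFLiteConverter.from_keras_model(model)
converter.optimizations = [tf.lite.Optimize.DEFAULT]
converter.representative_dataset = representative_dataset
converter.target_spec.supported_ops = [tf.lite.OpsSet.TFLITE_BUILTINS_INT8]
converter.inference_input_type = tf.int8
converter.inference_output_type = tf.int8

tflite_model = converter.convert()
with open('model_quantized.tflite', 'wb') as f:
    f.write(tflite_model)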
Input layer (33 features)
Output layer (5 classes)
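
As a quick sanity check of these input and output shapes, the sketch below runs a single placeholder sample through the model object trained in the code above; the random 33-feature input is illustrative only.

import numpy as np

# Placeholder sample with 33 features, matching the input layer.
sample = np.random.rand(1, 33).astype(np.float32)

# The softmax output contains one probability per class (5 values).
probabilities = model.predict(sample)
predicted_class = int(np.argmax(probabilities, axis=-1)[0])
print('predicted class:', predicted_class)
print('class probabilities:', probabilities[0])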