MJRoBot (Marcelo Rovai) / ecg_project_test

Training settings

Number of training cycles (numeric value)
Learning rate (a value between 0 and 1)
Training processor
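
These fields correspond one-to-one to parameters of the Keras code further down; the training processor option only selects the compute target and has no counterpart in the code. A minimal sketch of the mapping, with illustrative constant names; the values (30 training cycles, a 0.0005 learning rate) are the ones used in the expert-mode code under "Neural network architecture":

from tensorflow.keras.optimizers import Adam

TRAINING_CYCLES = 30     # "Number of training cycles" -> epochs passed to model.fit()
LEARNING_RATE = 0.0005   # "Learning rate" -> Adam step size, must lie between 0 and 1

opt = Adam(learning_rate=LEARNING_RATE)
# model.compile(loss='categorical_crossentropy', optimizer=opt, metrics=['accuracy'])
# model.fit(train_dataset, epochs=TRAINING_CYCLES, validation_data=validation_dataset)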

Augmentation settings

Advanced training settings

Neural network architecture

import sys
import numpy as np
import tensorflow as tf
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import Dense, InputLayer, Dropout, Conv1D, Conv2D, Flatten, Reshape, MaxPooling1D, MaxPooling2D, BatchNormalization
from tensorflow.keras.optimizers import Adam

# Edge Impulse helper libraries available inside the training container
sys.path.append('./resources/libraries')
import ei_tensorflow.training

# X_train, X_test, Y_train, Y_test, input_length, classes and callbacks are
# provided by the Edge Impulse training environment.

# add a channel dimension so each sample has shape (features, 1)
X_train = np.expand_dims(X_train, 2)
X_test = np.expand_dims(X_test, 2)
samples, features, depth = X_train.shape

train_dataset = tf.data.Dataset.from_tensor_slices((X_train, Y_train))
validation_dataset = tf.data.Dataset.from_tensor_slices((X_test, Y_test))

# model architecture
model = Sequential()
model.add(Reshape((int(input_length / 2), 2), input_shape=(input_length, )))
model.add(Conv1D(8, kernel_size=7, activation='relu', padding='same', input_shape=(features, depth)))
model.add(Conv1D(8, kernel_size=7, activation='relu', padding='same'))
model.add(MaxPooling1D(pool_size=5, strides=2, padding='same'))
model.add(Dropout(0.2))
model.add(Flatten())
model.add(Dense(4, activation='relu', activity_regularizer=tf.keras.regularizers.l1(0.00001)))
model.add(Dropout(0.2))
model.add(Dense(classes, activation='softmax', name='y_pred'))

# this controls the learning rate
opt = Adam(learning_rate=0.0005, beta_1=0.9, beta_2=0.999)

# this controls the batch size, or you can manipulate the tf.data.Dataset objects yourself
BATCH_SIZE = 32
train_dataset = train_dataset.batch(BATCH_SIZE, drop_remainder=False)
validation_dataset = validation_dataset.batch(BATCH_SIZE, drop_remainder=False)

# train the neural network on the batched datasets built above
model.compile(loss='categorical_crossentropy', optimizer=opt, metrics=['accuracy'])
model.fit(train_dataset, epochs=30, validation_data=validation_dataset, verbose=2, callbacks=callbacks)
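
The Reshape layer at the top of the model is what turns the flat input vector into a short 2-channel time series for the Conv1D blocks. A minimal sketch of that transformation in plain NumPy; the input_length value of 186 is illustrative, not taken from the project:

import numpy as np

input_length = 186                     # illustrative value only
window = np.arange(input_length, dtype=np.float32)

# Same operation the Reshape layer performs: input_length flat values become
# input_length / 2 time steps with 2 values (channels) each, which is the
# shape the first Conv1D layer consumes.
reshaped = window.reshape(int(input_length / 2), 2)
print(reshaped.shape)                  # (93, 2)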