I've been exploring ways to train multiple models within a single codebase. The aim is to build several models from different seeds so I can compare architectures and training data; the data splits are also generated from random seeds. However, the results differ between training a single model and training a sequence of models, and I suspect that the first model's training is influencing the later runs. Is there a way to ensure the training processes are completely independent of each other?

This is my code:

def build_model(self, seed):
    print("Seed:", seed)
    # L1/L2 regularization strengths (1e-5 and 1e-4)
    regularizer = keras.regularizers.L1L2(l1=1e-5, l2=1e-4)
    # Seeded initializer so the weight draw is reproducible per model
    initializer = keras.initializers.GlorotUniform(seed=seed)
    model = keras.Sequential([
        layers.Input(shape=self.input_shape),
        layers.Dense(7, activation='sigmoid',
                     kernel_initializer=initializer,
                     kernel_regularizer=regularizer),
        layers.Dense(1, activation='linear', kernel_initializer=initializer),
    ])
    model.compile(
        loss=lambda y_true, y_pred: self.huber_loss(y_true, y_pred, delta=0.35),
        optimizer='adam',
        metrics=['mae', 'mse'],
    )
    return model
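
I call this once per seed when comparing models (a minimal usage sketch, assuming `self.seeds` holds the list of seeds used throughout):

    # One independently initialized model per seed (hypothetical usage)
    models = [self.build_model(seed) for seed in self.seeds]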
# 1.12 Model Training
def train_models(self, model, epochs, batch_size):
    early_stop = keras.callbacks.EarlyStopping(monitor='val_loss', patience=50)
    histories = []
    for i in range(len(self.seeds)):
        # Re-initialize the weights before each run
        reset_weights(model)
        # Set the random seed for TensorFlow operations
        tf.random.set_seed(self.seeds[i])
        # Train the model
        with tf.compat.v1.Session() as sess:
            tf.compat.v1.keras.backend.set_session(sess)
            history = model.fit(
                self.normed_train_data[i],
                self.train_labels[i],
                epochs=epochs,
                validation_split=0.2,
                verbose=1,
                batch_size=batch_size,
                callbacks=[PrintDot(), early_stop],
            )
            histories.append(history)
    return histories
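
Is rebuilding the model from scratch on every iteration the right way to make the runs fully independent? Something like the sketch below is what I have in mind (a minimal sketch, not my actual code: `train_models_independent` is a hypothetical name, I'm assuming the same class attributes as above, and eager-mode TF 2.x so the `tf.compat.v1.Session` block is dropped):

    def train_models_independent(self, epochs, batch_size):
        histories = []
        for i, seed in enumerate(self.seeds):
            # Clear any leftover graph/layer state from the previous run
            keras.backend.clear_session()
            # Seed TensorFlow ops (weight init, shuffling) for this run
            tf.random.set_seed(seed)
            # Fresh model: no weights or optimizer state carry over
            model = self.build_model(seed)
            # Fresh callback per run so its state is self-contained
            early_stop = keras.callbacks.EarlyStopping(
                monitor='val_loss', patience=50)
            history = model.fit(
                self.normed_train_data[i],
                self.train_labels[i],
                epochs=epochs,
                validation_split=0.2,
                verbose=1,
                batch_size=batch_size,
                callbacks=[early_stop],
            )
            histories.append(history)
        return histories

I recreate the EarlyStopping callback inside the loop deliberately, so that no callback state is shared between runs.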