
Tensorflow Keras Create Model Using Functional API - Single Output Regression With Multiple Dense Layers (Python Example)

Building simple neural network models is a good starting point for understanding TensorFlow and getting used to its API.
Keras is used here; it sits on top of TensorFlow and provides a clean API layer with solid documentation.

The functional API is demonstrated here instead of the sequential API. The functional API offers more control over the model architecture, allows the creation of multi-output models, and is generally more flexible.
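
For quick context, here is a minimal sketch (with arbitrary layer sizes, not the model built later in this post) showing the same small network written with both APIs.


# sequential api: a plain stack of layers
from tensorflow.keras.models import Model, Sequential
from tensorflow.keras.layers import Dense, Input

sequential_model = Sequential([
    Input(shape=(3,)),
    Dense(8, activation="relu"),
    Dense(1),
])

# functional api: layers are called on tensors,
# so the graph can branch, merge or produce multiple outputs
inputs = Input(shape=(3,))
hidden = Dense(8, activation="relu")(inputs)
outputs = Dense(1)(hidden)
functional_model = Model(inputs=inputs, outputs=outputs)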


Create a regression output model using TensorFlow Keras (functional API)

Importing tensorflow


# importing tensorflow
import tensorflow as tf
from tensorflow.keras.models import Model
from tensorflow.keras.layers import Dense, Input


Defining a helper function to create dense layers for convenience.

Helper functions like this make it easier to build more complex layer structures and keep the code readable.


# function to create dense layers
def dense_layer_creator(units, name_for_layer, last_layer, activation=""):
    if activation != "":
        # if an activation value is provided
        # return a dense layer with activation
        result_output_layer = Dense(
            units=units, name=name_for_layer, activation=activation
        )(last_layer)

        return result_output_layer

    else:
        # if no activation is passed
        # return a dense layer without activation
        result_output_layer = Dense(
            units=units,
            name=name_for_layer,
        )(last_layer)

        return result_output_layer


Defining the input shape. This is usually derived from the X / training data; a sketch of that follows the next code block. Also defining the number of dense layers.


# defining the input shape
# this would depend on "X" data shape
input_shape = 3


# defining the count of dense layers
dense_layer_count = 3
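

As mentioned above, the input shape usually comes from the training data rather than being hard-coded. The sketch below uses a made-up NumPy array X_train (not part of the original post) to show how the value 3 could be derived.


# sketch: deriving input_shape from placeholder training data
import numpy as np

X_train = np.random.rand(100, 3)   # 100 rows, 3 feature columns
input_shape = X_train.shape[1]     # number of features -> 3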


Defining a function to create the TensorFlow model; it makes use of the previously defined variables and helper function.


# function to create a new model
# uses the functional api of tensorflow
# this function creates a single out regression model
def build_model():
    # defining and creating a input layer

    input_layer = Input(shape=(input_shape,), name="input_layer")

    previous_layer = input_layer

    dense_layers = None

    # creating multiple dense layers using loop
    for i in range(dense_layer_count):
        if i != 0:
            previous_layer = dense_layers

        dense_layers = dense_layer_creator(
            units=8,
            name_for_layer=f"dense_{i}",
            last_layer=previous_layer,
            activation="relu",
        )

    # not passing the activation argument for the last layer
    # regression output layers usually do not have an activation
    # use an activation if it is required

    output_layer = dense_layer_creator(
        units=1, name_for_layer="output_layer", last_layer=dense_layers
    )

    # created model
    # this has to be compiled for fitting / training
    model = Model(inputs=input_layer, outputs=output_layer)

    return model


Calling the build_model() function to generate a new model. Make sure to compile the model to prepare it for training.


# generating the model and storing
model = build_model()


Creating variables for the optimizer, loss function and metrics.


# using stochastic gradient descent
# tensorflow comes with more optimizers
optimizer = tf.keras.optimizers.SGD(learning_rate=0.001)


loss_strategy = tf.keras.losses.mean_squared_error
metric_watch = tf.keras.metrics.RootMeanSquaredError()
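

As the comment above notes, TensorFlow ships with more optimizers than plain SGD. The lines below are an optional sketch (not used in the rest of this post) showing how Adam or RMSprop could be swapped in.


# optional alternatives to SGD (not used below)
# optimizer = tf.keras.optimizers.Adam(learning_rate=0.001)
# optimizer = tf.keras.optimizers.RMSprop(learning_rate=0.001)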


Compiling the model with the parameters defined above. Models also need to be recompiled when layers are frozen, the optimizer is changed, the loss or metrics are changed, and in similar cases (see the sketch after the next code block).


# compiling the model with required options
model.compile(
    optimizer=optimizer,
    loss=loss_strategy,
    metrics=[metric_watch],
)
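

As a quick illustration of the recompile rule mentioned above, the sketch below freezes one of the dense layers created by build_model() and compiles the model again; the layer name dense_0 comes from the loop above, and the rest reuses the variables already defined. Note that running this changes the Trainable column and the trainable parameter count in the summary shown below; set trainable back to True (and recompile) to match that output.


# sketch: freeze a layer, then recompile so the change takes effect
model.get_layer("dense_0").trainable = False

model.compile(
    optimizer=optimizer,
    loss=loss_strategy,
    metrics=[metric_watch],
)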


Use the model.summary() method to display information about the created model. You can pass the show_trainable argument to display an additional column called Trainable. This column shows whether each layer can be trained or is frozen; this is handy when using transfer learning or fine-tuning only specific layers.


# viewing the model info
# pass "show_trainable=True"
# to see whether a layer is trainable
model.summary(show_trainable=True)

Model: "model_2" ____________________________________________________________________________ Layer (type) Output Shape Param # Trainable ============================================================================ input_layer (InputLayer) [(None, 3)] 0 Y dense_0 (Dense) (None, 8) 32 Y dense_1 (Dense) (None, 8) 72 Y dense_2 (Dense) (None, 8) 72 Y output_layer (Dense) (None, 1) 9 Y ============================================================================ Total params: 185 (740.00 Byte) Trainable params: 185 (740.00 Byte) Non-trainable params: 0 (0.00 Byte) ____________________________________________________________________________



Combining everything, here is the complete code to create a TensorFlow deep learning model using the functional API with multiple dense layers.


# importing tensorflow
import tensorflow as tf
from tensorflow.keras.models import Model
from tensorflow.keras.layers import Dense, Input

# function to create dense layers
def dense_layer_creator(units, name_for_layer, last_layer, activation=""):
    if activation != "":
        # if an activation value is provided
        # return a dense layer with activation
        result_output_layer = Dense(
            units=units, name=name_for_layer, activation=activation
        )(last_layer)

        return result_output_layer

    else:
        # if no activation is passed
        # return a dense layer without activation
        result_output_layer = Dense(
            units=units,
            name=name_for_layer,
        )(last_layer)

        return result_output_layer
        
# defining the input shape
# this would depend on "X" data shape
input_shape = 3


# defining the count of dense layers
dense_layer_count = 3

# function to create a new model
# uses the functional api of tensorflow
# this function creates a single out regression model
def build_model():
    # defining and creating a input layer

    input_layer = Input(shape=(input_shape,), name="input_layer")

    previous_layer = input_layer

    dense_layers = None

    # creating multiple dense layers using loop
    for i in range(dense_layer_count):
        if i != 0:
            previous_layer = dense_layers

        dense_layers = dense_layer_creator(
            units=8,
            name_for_layer=f"dense_{i}",
            last_layer=previous_layer,
            activation="relu",
        )

    # not passing the activation argument for the last layer
    # regression output layers usually do not have an activation
    # use an activation if it is required

    output_layer = dense_layer_creator(
        units=1, name_for_layer="output_layer", last_layer=dense_layers
    )

    # created model
    # this has to be compiled for fitting / training
    model = Model(inputs=input_layer, outputs=output_layer)

    return model


# generating the model and storing
model = build_model()


# using stochastic gradient descent
# tensorflow comes with more optimizers
optimizer = tf.keras.optimizers.SGD(learning_rate=0.001)


loss_strategy = tf.keras.losses.mean_squared_error
metric_watch = tf.keras.metrics.RootMeanSquaredError()


# compiling the model with required options
model.compile(
    optimizer=optimizer,
    loss=loss_strategy,
    metrics=[metric_watch],
)


# viewing the model info
# pass "show_trainable=True"
# to see whether a layer is trainable
model.summary(show_trainable=True)
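

To check that the compiled model actually trains, the sketch below fits it on randomly generated placeholder data; X_train and y_train here are made-up arrays, not part of the original post, shaped to match the input_shape of 3 and the single regression output.


# sketch: fitting the compiled model on synthetic placeholder data
import numpy as np

X_train = np.random.rand(100, 3)   # 100 samples, 3 features (matches input_shape)
y_train = np.random.rand(100, 1)   # one regression target per sample

history = model.fit(X_train, y_train, epochs=5, batch_size=16, verbose=0)
print(history.history["loss"])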
