Exercise 1
Last week, you set up a closure for a neural network layer. This week, you'll build a `Layer` class. Here are the requirements:

- The constructor should have two arguments (in addition to `self`):
  - `shape`: A data structure describing the shape of the layer. You can choose which data structure to use here; obvious choices are tuple and list. There may be others, but think about ease of implementation and use. Keep the same convention as last week: the first slot should be the number of inputs to the layer and the second slot should be the number of nodes in the layer.
  - `actv`: The activation function.
- The constructor should initialize the weights and biases of the correct size with random values. These weights and biases should be attributes of the instance.
- Make the activation function another attribute of the instance.
- The class should have an instance method called `forward`, which calculates the output of the layer. `forward` should take one argument (in addition to `self`): `inputs`.
Here's some pseudocode on how a user might want to use your class.
```python
layer1 = Layer(shape1, actv)
layer2 = Layer(shape2, actv)

h1 = layer1.forward(inputs)
h2 = layer2.forward(h1)
```
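One possible sketch of a class meeting these requirements, assuming NumPy for the array math. The names `shape`, `actv`, and `forward` come from the exercise; the choice of `np.random.randn` for initialization, the `(n_inputs, n_nodes)` weight orientation, and the `relu` demo activation are assumptions made for illustration.

```python
import numpy as np

class Layer:
    def __init__(self, shape, actv):
        # shape follows last week's convention: (number of inputs, number of nodes)
        n_inputs, n_nodes = shape
        # Random weights and biases stored as instance attributes
        self.weights = np.random.randn(n_inputs, n_nodes)
        self.biases = np.random.randn(n_nodes)
        # The activation function is also an attribute
        self.actv = actv

    def forward(self, inputs):
        # Affine transform followed by the activation function
        return self.actv(inputs @ self.weights + self.biases)

# Usage mirroring the pseudocode above, with a demo ReLU activation
def relu(x):
    return np.maximum(0, x)

layer1 = Layer((3, 4), relu)
layer2 = Layer((4, 2), relu)

inputs = np.random.randn(5, 3)   # batch of 5 samples, 3 features each
h1 = layer1.forward(inputs)      # shape (5, 4)
h2 = layer2.forward(h1)          # shape (5, 2)
```

Storing weights as `(n_inputs, n_nodes)` lets `forward` accept a whole batch at once via matrix multiplication, with the biases broadcast across rows.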
Exercise 2
Spice up your `Layer` class with some special methods. You must include `__str__` and `__repr__`, and at least one other dunder method of your choosing. Dunder methods are fun to play with, so feel free to try a few out.
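One way the special methods could look, continuing the sketch from Exercise 1. Here `__call__` is one choice for the extra dunder (it makes `layer(x)` shorthand for `layer.forward(x)`); the exact strings returned and the `identity` demo activation are assumptions, not requirements.

```python
import numpy as np

class Layer:
    def __init__(self, shape, actv):
        self.shape = shape
        n_inputs, n_nodes = shape
        self.weights = np.random.randn(n_inputs, n_nodes)
        self.biases = np.random.randn(n_nodes)
        self.actv = actv

    def forward(self, inputs):
        return self.actv(inputs @ self.weights + self.biases)

    def __str__(self):
        # Readable summary aimed at end users (what print() shows)
        return f"Layer with {self.shape[0]} inputs and {self.shape[1]} nodes"

    def __repr__(self):
        # Unambiguous form aimed at developers (what the REPL shows)
        return f"Layer(shape={self.shape!r}, actv={self.actv.__name__})"

    def __call__(self, inputs):
        # Extra dunder: lets a Layer be used like a function
        return self.forward(inputs)

def identity(x):
    return x

layer = Layer((3, 2), identity)
print(str(layer))    # Layer with 3 inputs and 2 nodes
print(repr(layer))   # Layer(shape=(3, 2), actv=identity)
```

A common convention is that `__str__` is for people and `__repr__` is for debugging, ideally looking like the call that would recreate the object.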
Deliverables
`exercise.py` with the complete class as specified in Exercises 1 and 2.