Very Large Loss Values When Training Multiple Regression Model In Keras
I was trying to build a multiple regression model to predict housing prices using the following features: [bedrooms bathrooms sqft_living view grade] = [0.09375 0.266667
Solution 1:
A lot of people believe in scaling everything. If your y
goes up to 8 million, I'd scale it, yes, and reverse the scaling when you get predictions out later.
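As a minimal sketch of what that scaling might look like (the price values and the use of scikit-learn's StandardScaler here are assumptions for illustration, not the asker's actual data or pipeline):

```python
import numpy as np
from sklearn.preprocessing import StandardScaler

# Hypothetical target values (house prices up to several million).
y = np.array([221900.0, 538000.0, 180000.0, 7700000.0])

# Scale the target before training; Keras then sees values near zero.
y_scaler = StandardScaler()
y_scaled = y_scaler.fit_transform(y.reshape(-1, 1))

# ... train the Keras model on y_scaled ...

# After predicting, map the scaled outputs back to dollar amounts.
preds_scaled = y_scaled  # stand-in for model.predict(X)
preds = y_scaler.inverse_transform(preds_scaled)
```

The same idea works with MinMaxScaler or a plain division by a constant; the point is simply to undo the transformation on the predictions before reporting them.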
Don't worry too much about the specific loss
number you see. Sure, 40 trillion is ridiculously high and suggests the network architecture or parameters may need changes, but the main concern is whether the validation loss is actually decreasing and the network is actually learning. If, as you say, it 'went down to 0.2 after 400 epochs', then it sounds like you're on the right track.
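A rough sketch of what "watch the validation loss trend" looks like in practice; the architecture and the random data below are placeholders, not the asker's model:

```python
import numpy as np
from tensorflow import keras

# Placeholder data: 5 features (as in the question) and an already-scaled target.
X = np.random.rand(1000, 5)
y = np.random.rand(1000, 1)

model = keras.Sequential([
    keras.Input(shape=(5,)),
    keras.layers.Dense(32, activation="relu"),
    keras.layers.Dense(1),
])
model.compile(optimizer="adam", loss="mse")

history = model.fit(X, y, epochs=50, validation_split=0.2, verbose=0)

# What matters is the trend of val_loss across epochs, not its raw magnitude.
print(history.history["val_loss"][:5], "...", history.history["val_loss"][-5:])
```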
There are many other loss functions for regression problems besides log-MSE, MSE, and MAE; have a look at the Keras losses documentation. Hope that helps!
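For example, here is a hedged sketch of swapping in two other built-in Keras regression losses (the tiny model is a throwaway just to show the compile calls):

```python
from tensorflow import keras

# Any regression model will do; this one exists only to demonstrate compile().
model = keras.Sequential([keras.Input(shape=(5,)), keras.layers.Dense(1)])

# Mean squared logarithmic error: penalises relative error, which can suit
# targets spanning several orders of magnitude, like house prices.
model.compile(optimizer="adam", loss=keras.losses.MeanSquaredLogarithmicError())

# Huber loss: behaves like MSE for small residuals and like MAE for large ones,
# so a few very expensive houses don't dominate the gradient.
model.compile(optimizer="adam", loss=keras.losses.Huber(delta=1.0))
```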