
Practice Test 2 | Google Cloud Certified Professional Data Engineer | Dumps | Mock Test


You are building and training your own machine learning model. When you tested the model on a test set, you realized the error rate is very high: the model's output matched only 25% of the expected output.

What is the problem you are facing and how can you fix it?

A. The model is underfitting: you need to increase the number of features and use more training data.
B. The model is underfitting: you need to reduce the number of features and use less training data.
C. The model is overfitting: you need to reduce the number of features and use more training data.
D. The model is overfitting: you need to increase the number of features and use more training data.

Answer: C.

Overfitting happens when a model performs well on the training set, producing only a small error, but gives wrong output on the test set. This happens because the model picks up specific patterns found only in the training examples instead of learning the general features of the data.

The opposite of overfitting is underfitting. Underfitting occurs when there is still room for improvement on the test data. This can happen for a number of reasons: the model is not powerful enough, it is over-regularized, or it has simply not been trained long enough. In that case the network has not learned the relevant patterns in the training data.
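To make the symptom concrete, here is a minimal, self-contained sketch (not from the cited tutorial; it uses scikit-learn and synthetic, randomly labelled data) in which a flexible model simply memorizes its training set: training accuracy is near 100% while test accuracy stays close to chance, which is exactly the train/test gap described above.

    # Overfitting in miniature: a deep decision tree memorizes 40 noisy samples.
    import numpy as np
    from sklearn.model_selection import train_test_split
    from sklearn.tree import DecisionTreeClassifier

    rng = np.random.default_rng(0)
    X = rng.normal(size=(40, 50))      # few examples, many features
    y = rng.integers(0, 2, size=40)    # labels are pure noise

    X_train, X_test, y_train, y_test = train_test_split(
        X, y, test_size=0.5, random_state=0)

    clf = DecisionTreeClassifier(random_state=0).fit(X_train, y_train)
    print("train accuracy:", clf.score(X_train, y_train))  # ~1.0: memorized the training set
    print("test accuracy:", clf.score(X_test, y_test))     # ~0.5: no better than chance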

To solve overfitting, the following would help improve the model's quality:

    • Increase the number of training examples: the more data a model is trained with, the more cases it sees and the better its predictions become.
    • Tune hyperparameters such as the number and size of hidden layers (for neural networks), and apply regularization, i.e. techniques that make the model simpler, such as dropout (randomly dropping neurons during training) or adding a “penalty” term to the cost function; see the sketch after this list.
    • Remove irrelevant features. Feature engineering is a wide subject and feature selection is a critical part of building and training a model. Some algorithms have built-in feature selection, but in other cases data scientists need to manually select or remove features to debug and find the best-performing model.
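
As a concrete illustration of the regularization point above, below is a minimal TensorFlow/Keras sketch; the layer sizes, dropout rate, L2 factor, and the assumed 20 input features are illustrative choices, not part of the question or the cited tutorial.

    # A small network that combats overfitting with an L2 weight penalty and dropout.
    import tensorflow as tf
    from tensorflow.keras import layers, regularizers

    model = tf.keras.Sequential([
        layers.Input(shape=(20,)),                               # assumed 20 input features
        layers.Dense(64, activation="relu",
                     kernel_regularizer=regularizers.l2(1e-4)),  # L2 "penalty" on the weights
        layers.Dropout(0.3),                                     # randomly drop 30% of units
        layers.Dense(64, activation="relu",
                     kernel_regularizer=regularizers.l2(1e-4)),
        layers.Dropout(0.3),
        layers.Dense(1, activation="sigmoid"),                   # binary classification head
    ])

    model.compile(optimizer="adam",
                  loss="binary_crossentropy",
                  metrics=["accuracy"])

Note that the dropout rate and the L2 factor are themselves hyperparameters: values that are too aggressive push the model toward underfitting instead.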

 

From the brief explanation, to solve the overfitting problem in the scenario, you need to:

    • Increase the training set.
    • Reduce the number of features.

 

Hence, answer C is correct.

Answer A & B are incorrect: The problem in this scenario is not underfitting.

Answer D is incorrect: to solve overfitting, you should decrease the number of features, not increase them.

Source(s):

Overfitting and underfitting: https://www.tensorflow.org/tutorials/keras/overfit_and_underfit

