
iav ML Study Group Pop Quiz 2
Authored by Bob Balooey

14 questions
1.
MULTIPLE CHOICE QUESTION
30 sec • 1 pt
K-Nearest Neighbours, Random Forests and Gradient Boosting are all algorithms which...
...can only be used for Classification if there are fewer features than samples
...can only be used for Regression if there are fewer features than samples
...can be used for both Regression and Classification
...are only academic theories, and are not practical for real ML solutions
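All three algorithms in question 1 ship in both classifier and regressor form. A minimal sketch using scikit-learn (one concrete library, not named in the quiz) to make that concrete:

```python
# Each algorithm from question 1 has a classifier and a regressor variant.
from sklearn.neighbors import KNeighborsClassifier, KNeighborsRegressor
from sklearn.ensemble import (
    RandomForestClassifier, RandomForestRegressor,
    GradientBoostingClassifier, GradientBoostingRegressor,
)
from sklearn.datasets import make_classification, make_regression

Xc, yc = make_classification(n_samples=100, random_state=0)
Xr, yr = make_regression(n_samples=100, random_state=0)

# Classification with all three algorithms.
for clf in (KNeighborsClassifier(), RandomForestClassifier(random_state=0),
            GradientBoostingClassifier(random_state=0)):
    clf.fit(Xc, yc)

# Regression with the same three algorithms.
for reg in (KNeighborsRegressor(), RandomForestRegressor(random_state=0),
            GradientBoostingRegressor(random_state=0)):
    reg.fit(Xr, yr)
```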
2.
MULTIPLE CHOICE QUESTION
30 sec • 1 pt
ROC stands for
Receiver Operating Characteristic (curve)
the mythical Roc, a giant bird from the legend of Sinbad the sailor
Regressive Orthogonal Curve
Random Omissions Cumulative
3.
MULTIPLE CHOICE QUESTION
30 sec • 1 pt
For a dataset with IMBALANCED classes, we would generally choose as our error metric:
Precision-Recall Curve, because we would expect a trade-off between precision and recall
ROC Curve, because it is accepted industry practice
Parabolic Curve, because the data is frequently quadratic
Straight-Line Entropy, because we want the most accurate measurement of error
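Question 3's intended answer can be demonstrated directly: on an imbalanced dataset, the precision-recall curve exposes the precision/recall trade-off as the decision threshold moves. A minimal sketch with scikit-learn (the dataset and classifier here are illustrative assumptions):

```python
# On an imbalanced problem, inspect the precision-recall trade-off.
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import precision_recall_curve
from sklearn.model_selection import train_test_split

# ~5% positive class: a deliberately imbalanced toy dataset.
X, y = make_classification(n_samples=2000, weights=[0.95, 0.05],
                           random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, stratify=y, random_state=0)

scores = (LogisticRegression(max_iter=1000)
          .fit(X_tr, y_tr)
          .predict_proba(X_te)[:, 1])

# Precision and recall at every threshold: as one rises, the other falls.
precision, recall, thresholds = precision_recall_curve(y_te, scores)
```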
4.
MULTIPLE CHOICE QUESTION
30 sec • 1 pt
How do we determine the correct value of K for KNN?
Trick Question: KNN provides this value.
As a hyperparameter, K is extrinsic to the dataset and must be arrived at via an iterative series of experiments.
As a parameter, K is intrinsic to the dataset and can simply be read out of it.
K can be derived synthetically from other statistical measures like sigma.
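The "iterative series of experiments" in question 4's correct option is usually cross-validated search over candidate K values. One possible sketch, using scikit-learn's GridSearchCV on a stock dataset (both are illustrative choices, not prescribed by the quiz):

```python
# K is a hyperparameter: try candidate values and cross-validate each one.
from sklearn.datasets import load_iris
from sklearn.model_selection import GridSearchCV
from sklearn.neighbors import KNeighborsClassifier

X, y = load_iris(return_X_y=True)

search = GridSearchCV(
    KNeighborsClassifier(),
    param_grid={"n_neighbors": [1, 3, 5, 7, 9, 11]},  # candidate K values
    cv=5,                                             # 5-fold cross-validation
)
search.fit(X, y)

best_k = search.best_params_["n_neighbors"]
```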
5.
MULTIPLE SELECT QUESTION
45 sec • 1 pt
We can describe upsampling and downsampling for imbalanced datasets as follows (pick all correct options):
Downsampling adds importance to the MINOR class, sending recall up and precision down
Downsampling adds importance to the MAJOR class, sending recall down and precision up
Upsampling will mitigate excessive weight on the MINOR class. Recall will still be higher than precision, but the gap will lessen
Upsampling will mitigate excessive weight on the MAJOR class. Precision will still be higher than recall, but the gap will lessen
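The mechanics behind question 5's options can be sketched in plain NumPy, assuming a binary label vector where class 0 is the majority (an assumption made here for illustration): downsampling discards majority rows until the classes match, upsampling repeats minority rows with replacement.

```python
# Plain-NumPy resampling for an imbalanced binary label vector.
import numpy as np

rng = np.random.default_rng(0)
y = np.array([0] * 90 + [1] * 10)        # 90 majority rows, 10 minority rows
maj = np.where(y == 0)[0]
mino = np.where(y == 1)[0]

# Downsample: keep only as many majority rows as there are minority rows.
down_idx = np.concatenate(
    [rng.choice(maj, size=len(mino), replace=False), mino])

# Upsample: repeat minority rows (with replacement) to match the majority.
up_idx = np.concatenate(
    [maj, rng.choice(mino, size=len(maj), replace=True)])
```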
6.
FILL IN THE BLANK QUESTION
1 min • 1 pt
These are the four steps in blagging (balanced bagging), possibly out of order:
1) BALANCE each sample by downsampling
2) MAJORITY vote
3) BOOTSTRAP samples from the population
4) LEARN a decision tree
To answer the question, put all 4 steps in order with no spaces in the box e.g. "6958"
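The four blagging steps from question 6 can be sketched as code. This is an illustrative implementation, not a reference one: it assumes binary labels 0/1 with 0 the majority class, and the helper names are invented here.

```python
# Sketch of blagging (balanced bagging): bootstrap, balance by
# downsampling, learn a decision tree, then majority-vote.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.tree import DecisionTreeClassifier

def blagging_fit(X, y, n_trees=25, seed=0):
    rng = np.random.default_rng(seed)
    trees = []
    for _ in range(n_trees):
        boot = rng.integers(0, len(y), size=len(y))   # BOOTSTRAP a sample
        Xb, yb = X[boot], y[boot]
        maj = np.where(yb == 0)[0]
        mino = np.where(yb == 1)[0]
        if len(mino) == 0 or len(maj) < len(mino):
            continue                                  # degenerate bootstrap
        keep = np.concatenate(                        # BALANCE by downsampling
            [rng.choice(maj, size=len(mino), replace=False), mino])
        trees.append(DecisionTreeClassifier(random_state=0)
                     .fit(Xb[keep], yb[keep]))        # LEARN a decision tree
    return trees

def blagging_predict(trees, X):
    votes = np.mean([t.predict(X) for t in trees], axis=0)
    return (votes >= 0.5).astype(int)                 # MAJORITY vote

# Demo on an imbalanced toy dataset.
X, y = make_classification(n_samples=200, weights=[0.9, 0.1], random_state=0)
trees = blagging_fit(X, y)
preds = blagging_predict(trees, X)
```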
7.
MULTIPLE CHOICE QUESTION
30 sec • 1 pt
The broadest and most flexible ensemble method, which mixes an arbitrary number of models of any type, is called:
Piling
Stacking
Chaining
Combining
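Question 7's answer, stacking, feeds the predictions of heterogeneous base models to a meta-learner. One concrete realisation is scikit-learn's StackingClassifier (shown here as an illustration; the base models chosen are arbitrary):

```python
# Stacking: heterogeneous base models plus a meta-learner on top.
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier, StackingClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.neighbors import KNeighborsClassifier

X, y = make_classification(n_samples=300, random_state=0)

stack = StackingClassifier(
    estimators=[("knn", KNeighborsClassifier()),
                ("rf", RandomForestClassifier(random_state=0))],
    final_estimator=LogisticRegression(),  # meta-learner over base outputs
)
stack.fit(X, y)
```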