ML2 Comparison of Models

12th Grade • 14 Qs

Similar activities

9.1 Weathering (Regents Earth Science) • 8th - 12th Grade • 15 Qs
Earth's Atmosphere • 11th - 12th Grade • 15 Qs
Rock Record • 8th Grade - University • 15 Qs
Formative 1.5 The Switch • 9th - 12th Grade • 10 Qs
W/E/D CRQ Practice Test for Regents Earth Science • 7th - 12th Grade • 15 Qs
Midterm Review Part II • 9th - 12th Grade • 18 Qs
Transport in cells • 9th - 12th Grade • 16 Qs
Text Features • 3rd Grade - University • 15 Qs

ML2 Comparison of Models

Assessment • Quiz • Science • 12th Grade • Hard

Created by jaime bustamante • Used 2+ times

14 questions


1. MULTIPLE CHOICE QUESTION (30 sec • 1 pt)

What is a key advantage of Naïve Bayes?

- Processes all cases at root
- Handles real-valued parameters well
- Requires feature transformation
- Robust to irrelevant features

2. MULTIPLE CHOICE QUESTION (30 sec • 1 pt)

When is Naïve Bayes optimal as a classifier?

- When the problem depends on many features
- When linear combinations of features are critical
- When the assumption of independence holds
- When the model depends on summing contributions of many attributes
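The independence assumption behind the two Naïve Bayes questions above can be seen directly in code: instead of modeling feature interactions, the classifier just multiplies per-feature likelihoods P(x_i | class). Below is a minimal pure-Python sketch on made-up categorical data (the `train`/`predict` names and the weather examples are invented for illustration):

```python
from collections import defaultdict

def train(samples):
    # samples: list of (features_tuple, label)
    counts = defaultdict(int)   # label -> count
    feat = defaultdict(int)     # (label, feature_index, value) -> count
    for x, y in samples:
        counts[y] += 1
        for i, v in enumerate(x):
            feat[(y, i, v)] += 1
    return counts, feat

def predict(model, x):
    counts, feat = model
    total = sum(counts.values())
    best, best_p = None, -1.0
    for y, c in counts.items():
        p = c / total           # prior P(y)
        for i, v in enumerate(x):
            # "naive" step: multiply independent, Laplace-smoothed
            # likelihoods P(x_i = v | y); +2 assumes binary features
            p *= (feat[(y, i, v)] + 1) / (c + 2)
        if p > best_p:
            best, best_p = y, p
    return best

data = [(("sunny", "hot"), "no"), (("rainy", "cool"), "yes"),
        (("sunny", "cool"), "no"), (("rainy", "hot"), "yes")]
model = train(data)
print(predict(model, ("rainy", "hot")))  # -> "yes"
```

When the features really are conditionally independent given the class, this product equals the true joint likelihood, which is exactly the condition under which Naïve Bayes is optimal.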

3. MULTIPLE CHOICE QUESTION (30 sec • 1 pt)

What is an advantage of Decision Trees?

- Recursive partitioning is relatively fast
- Requires feature transformation
- Complex to explain
- Handles real-valued parameters well
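The "recursive partitioning is relatively fast" answer above can be illustrated with a minimal 1-D tree builder: each node is found with a single linear scan over candidate thresholds, then the two sides are handled recursively, and the result reads as plain if/else rules. This is a toy sketch on invented, cleanly separable data, not a production CART implementation:

```python
def impurity(points):
    # misclassifications if this group predicts its majority label
    votes = {}
    for _, y in points:
        votes[y] = votes.get(y, 0) + 1
    return len(points) - max(votes.values())

def build(points):
    # points: list of (x, label); assumes classes are separable on x
    labels = {y for _, y in points}
    if len(labels) == 1:
        return labels.pop()          # pure leaf
    xs = sorted(x for x, _ in points)
    best = None
    for i in range(len(xs) - 1):
        t = (xs[i] + xs[i + 1]) / 2  # candidate threshold
        left = [p for p in points if p[0] <= t]
        right = [p for p in points if p[0] > t]
        err = impurity(left) + impurity(right)
        if best is None or err < best[0]:
            best = (err, t, left, right)
    _, t, left, right = best
    return (t, build(left), build(right))

def classify(tree, x):
    while isinstance(tree, tuple):   # walk if/else rules to a leaf
        t, left, right = tree
        tree = left if x <= t else right
    return tree

tree = build([(1, "a"), (2, "a"), (8, "b"), (9, "b")])
print(classify(tree, 1.5))  # -> "a"
```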

4. MULTIPLE CHOICE QUESTION (30 sec • 1 pt)

When is XGBoost recommended to be used?

- When good performance is needed at the cost of computation
- When model explainability is crucial
- When regression works well
- When overfitting is a big concern
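The "performance at the cost of computation" trade-off comes from gradient boosting, the idea underlying XGBoost: each round fits a new weak learner to the residuals of the current ensemble, so accuracy accumulates over many sequential rounds of work. The sketch below is pure Python on made-up 1-D regression data and is only the core idea, not XGBoost's actual (regularized, tree-based, heavily optimized) implementation:

```python
def fit_stump(xs, residuals):
    # one-split regression stump: predict the mean on each side of t
    best = None
    for t in xs:
        left = [r for x, r in zip(xs, residuals) if x <= t]
        right = [r for x, r in zip(xs, residuals) if x > t]
        lmean = sum(left) / len(left) if left else 0.0
        rmean = sum(right) / len(right) if right else 0.0
        err = (sum((r - lmean) ** 2 for r in left)
               + sum((r - rmean) ** 2 for r in right))
        if best is None or err < best[0]:
            best = (err, t, lmean, rmean)
    _, t, lmean, rmean = best
    return lambda x: lmean if x <= t else rmean

def boost(xs, ys, rounds=20, lr=0.5):
    stumps, pred = [], [0.0] * len(xs)
    for _ in range(rounds):
        # fit the next learner to what the ensemble still gets wrong
        residuals = [y - p for y, p in zip(ys, pred)]
        s = fit_stump(xs, residuals)
        stumps.append(s)
        pred = [p + lr * s(x) for p, x in zip(pred, xs)]
    return lambda x: sum(lr * s(x) for s in stumps)

model = boost([1, 2, 3, 4], [1.0, 1.0, 3.0, 3.0])
print(round(model(1), 2), round(model(4), 2))  # -> 1.0 3.0
```

Each round shrinks the residuals geometrically here, which is why more rounds (more computation) buy better fit.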

5. MULTIPLE CHOICE QUESTION (30 sec • 1 pt)

What is a key feature of Random Forests?

- Increases Variance
- Dependent classifiers
- Handles overfitting well
- Requires feature transformation
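The "handles overfitting well" answer comes from bagging: train many independent learners on bootstrap resamples and take a majority vote, which averages away the variance of any single learner. Below is a random-forest-flavoured sketch in pure Python with a trivial 1-D threshold stump as the base learner; the data and function names are invented for illustration:

```python
import random

def fit_tree(points):
    # weak base learner: predict "b" above a threshold, "a" at or below
    best = None
    for t, _ in points:
        err = sum((y == "b") != (x > t) for x, y in points)
        if best is None or err < best[0]:
            best = (err, t)
    t = best[1]
    return lambda x: "b" if x > t else "a"

def fit_forest(points, n_trees=31, seed=0):
    rng = random.Random(seed)
    # each tree sees a bootstrap resample (sampling with replacement)
    trees = [fit_tree([rng.choice(points) for _ in points])
             for _ in range(n_trees)]
    def vote(x):
        preds = [t(x) for t in trees]
        return max(set(preds), key=preds.count)  # majority vote
    return vote

data = [(1, "a"), (2, "a"), (3, "a"), (7, "b"), (8, "b"), (9, "b")]
forest = fit_forest(data)
print(forest(2), forest(8))
```

Note how the trees are trained independently of one another (unlike boosting's sequential fits), which is what makes the "independent classifiers" framing in the next question apply to bagging.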

6. MULTIPLE SELECT QUESTION (45 sec • 1 pt)

Which ones are true?

- Bagging handles overfitting, Boosting reduces bias
- Bagging is independent classifiers, Boosting is sequential classifiers
- Bagging is sequential, Boosting is parallel
- Bagging reduces bias, Boosting reduces variance

7. MULTIPLE SELECT QUESTION (45 sec • 1 pt)

When do we use LDA?

- When Logistic Regression is stable
- When the number of samples is small
- When there are only two classes
- When Logistic Regression doesn't work
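LDA's fit for small samples comes from how little it estimates: just class means, a pooled variance, and priors, where logistic regression's coefficients can blow up (e.g. under perfect separation). A minimal 1-D, two-class sketch on made-up data, using the standard linear discriminant score (Gaussian classes with shared variance):

```python
import math

def fit_lda(samples):
    # samples: list of (x, label)
    by_class = {}
    for x, y in samples:
        by_class.setdefault(y, []).append(x)
    n = len(samples)
    means = {y: sum(xs) / len(xs) for y, xs in by_class.items()}
    priors = {y: len(xs) / n for y, xs in by_class.items()}
    # pooled within-class variance (shared across classes)
    var = sum((x - means[y]) ** 2
              for y, xs in by_class.items() for x in xs)
    var /= n - len(by_class)
    def predict(x):
        def score(y):
            # linear discriminant: x*mu/var - mu^2/(2*var) + log(prior)
            return (x * means[y] / var
                    - means[y] ** 2 / (2 * var)
                    + math.log(priors[y]))
        return max(by_class, key=score)
    return predict

predict = fit_lda([(1.0, "a"), (2.0, "a"), (7.0, "b"), (8.0, "b")])
print(predict(2.5), predict(6.5))  # -> a b
```

The same construction extends naturally to more than two classes (one discriminant score per class), which is one reason LDA is preferred over plain two-class logistic regression in multi-class settings.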
