Backdoor Defense and Attack Strategies Quiz

University • 16 Qs

Similar activities

Anti-Backdoor Learning Quiz (University, 14 Qs)

Backdoor Attack Quiz (University, 14 Qs)

Logical Fallacies (8th Grade - University, 20 Qs)

EXPOSITORY WRITING (University, 16 Qs)

Titus Act I-II Review Quiz (12th Grade - University, 15 Qs)

VIKINGS (University, 20 Qs)

Ad Hominem Fallacy (10th Grade - University, 15 Qs)

Anne of Green Gables Intermediate (University, 11 Qs)

Backdoor Defense and Attack Strategies Quiz

Assessment: Quiz
Language: English
Level: University
Difficulty: Hard

Created by Crappy Things

16 questions

1. MULTIPLE CHOICE QUESTION (30 sec • 1 pt)

According to the simplified illustration, how does a backdoor trigger alter the model's decision-making process?

It pushes the decision boundary further away from the non-target classes (B and C), making them harder to misclassify.

It creates a 'shortcut' or a new, small region of misclassification (backdoor area) for non-target classes that is very close to their original location in the feature space.

It moves the representations of all inputs (A, B, and C) into a single point in the feature space.

It only affects the representations of the target class A, making them more robust.

2. MULTIPLE CHOICE QUESTION (30 sec • 1 pt)

The 'Minimum Δ needed to misclassify all samples into A' is a key metric. What does a significantly smaller Δ for an infected model imply?

The model is poorly trained and generally unstable.

The model has learned a highly efficient pathway (the backdoor) to the target class A for inputs from other classes, requiring minimal perturbation.

The clean model was already close to misclassifying everything as A.

The trigger pattern itself is very large and complex.
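To make the Δ metric concrete: for a linear decision function, the minimum perturbation that flips a sample into class A has a closed form (the distance to the decision boundary). A minimal numpy sketch, where the weights `w`, bias `b`, and sample are hypothetical values chosen for illustration:

```python
import numpy as np

# Hypothetical linear decision function f(x) = w.x + b; f(x) > 0 means class A.
w = np.array([1.0, -2.0])
b = -0.5

def min_delta_to_class_A(x):
    """L2 distance from x to the decision boundary: the minimum
    perturbation Δ that flips the prediction to class A."""
    score = w @ x + b
    if score > 0:            # already classified as A
        return 0.0
    return -score / np.linalg.norm(w)

x_clean = np.array([0.0, 1.0])       # classified as non-A (score = -2.5)
print(min_delta_to_class_A(x_clean))  # ≈ 1.118
```

In an infected model, the backdoor acts like a decision boundary pulled close to every non-target sample, so this distance collapses to an anomalously small value.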

3. MULTIPLE CHOICE QUESTION (30 sec • 1 pt)

True or False: The illustration shows that in an infected model, a triggered input from class B is represented in the exact same location in the feature space as a clean input from class A.

True

False

4. MULTIPLE CHOICE QUESTION (30 sec • 1 pt)

True or False: Neural Cleanse operates on the assumption that an attacker wants to make the backdoor trigger as large and noticeable as possible to ensure its effectiveness.

True

False
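Neural Cleanse in fact assumes the opposite: the attacker prefers a small, stealthy trigger, so the reverse-engineered trigger for the infected class has an anomalously small L1 norm. A sketch of its outlier-detection step using the median absolute deviation (MAD); the per-class trigger norms below are made up for illustration:

```python
import numpy as np

def anomaly_indices(trigger_norms):
    """Neural-Cleanse-style outlier detection: each class's anomaly
    index is its deviation from the median trigger norm, scaled by
    the median absolute deviation (MAD)."""
    norms = np.asarray(trigger_norms, dtype=float)
    med = np.median(norms)
    mad = np.median(np.abs(norms - med)) * 1.4826  # consistency constant
    return np.abs(norms - med) / mad

# Nine classes need large reverse-engineered triggers; the infected
# target class needs only a tiny one (hypothetical L1 norms).
norms = [97, 101, 99, 103, 98, 100, 102, 96, 104, 12]
idx = anomaly_indices(norms)
print(np.argmax(idx))  # 9: the class with the suspiciously small trigger
```

In the Neural Cleanse paper, an anomaly index above roughly 2 flags a class as a likely backdoor target; the original method additionally checks that the flagged norm is on the small side of the median.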

5. MULTIPLE CHOICE QUESTION (30 sec • 1 pt)

The slide states, 'On CIFAR-10, even if the poisoning rate is less than 1%, various attacks can still achieve high attack success rates.' What is the most critical implication of this for defense design?

Defenses must be able to perfectly identify every single backdoored sample to be effective.

Simply removing a random 1% of the training data is a viable defense strategy.

The backdoor signal is very strong and easily learned, so defenses must be highly sensitive and cannot rely on the rarity of poisoned samples alone.

The CIFAR-10 dataset is inherently flawed and should not be used for security research.
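The scale involved is easy to underestimate; a BadNets-style sketch of what a sub-1% poisoning rate looks like in code (the 3x3 white patch, target class, and dummy CIFAR-10-shaped data are all hypothetical choices for illustration):

```python
import numpy as np

def poison_dataset(images, labels, target_class=0, rate=0.01, seed=0):
    """Stamp a trigger patch on a `rate` fraction of the images and
    relabel them to the attacker's target class."""
    rng = np.random.default_rng(seed)
    images, labels = images.copy(), labels.copy()
    n_poison = max(1, int(rate * len(images)))
    idx = rng.choice(len(images), size=n_poison, replace=False)
    images[idx, -3:, -3:, :] = 255   # bottom-right 3x3 white patch
    labels[idx] = target_class
    return images, labels, idx

# CIFAR-10-shaped dummy data: 1000 images of 32x32x3, all labeled class 1.
imgs = np.zeros((1000, 32, 32, 3), dtype=np.uint8)
lbls = np.ones(1000, dtype=np.int64)
p_imgs, p_lbls, idx = poison_dataset(imgs, lbls, rate=0.01)
print(len(idx))  # 10 poisoned samples = 1% of the data
```

Just ten stamped-and-relabeled images in a thousand can suffice for a high attack success rate, which is why defenses cannot rely on poisoned samples being rare.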

6. MULTIPLE CHOICE QUESTION (30 sec • 1 pt)

What is the primary risk of a defense strategy that 'may accidentally remove a lot of valuable data when the dataset is completely clean'?

It increases the time and computational cost of training.

It would alert the attacker that a defense is in place.

It degrades the model's performance on its primary task (i.e., hurts clean accuracy) by removing valid training examples.

It might remove the wrong backdoored samples, leaving the most effective ones in the dataset.

7. MULTIPLE CHOICE QUESTION (30 sec • 1 pt)

True or False: The bar chart suggests that the 'Blend' attack is generally less effective than the 'Trojan' attack at lower poisoning rates.

True

False
