Sample Quiz 2
CSCI-567 – Machine Learning
Fall 2024
1. Instruction tuning is often formulated as a supervised learning problem where the model is fine-tuned on input-output pairs. If yi is the expected output, xi is the input, and zi is the instruction, which of the following is the correct loss function?
a) L(θ) = − log Pθ(xi|yi, zi)
b) L(θ) = − log Pθ(yi|xi, zi)
c) L(θ) = − log Pθ(zi|xi, yi)
d) L(θ) = − log Pθ(zi|xi)
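The loss in option (b), the negative log-likelihood of the expected output given the input and the instruction, can be sketched as a toy function (the helper name and probabilities below are hypothetical, not from any particular framework):

```python
import math

# Minimal sketch: instruction tuning penalizes only the probability the
# model assigns to the expected output y_i, conditioned on the input x_i
# and the instruction z_i, i.e.  L(theta) = -log P_theta(y_i | x_i, z_i).
def instruction_tuning_loss(p_y_given_x_z):
    """Negative log-likelihood of the expected output."""
    return -math.log(p_y_given_x_z)

# A confident model (probability near 1) incurs a small loss;
# an unconfident one incurs a large loss.
assert instruction_tuning_loss(0.9) < instruction_tuning_loss(0.1)
assert abs(instruction_tuning_loss(1.0)) < 1e-12
```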
2. In Gaussian Mixture Models (GMMs), each component is a Gaussian distribution characterized by its mean and covariance. When fitting a GMM to data, what is the role of the covariance matrices, and how do they affect the shape of the clusters?
a) The covariance matrices determine the orientation and shape of each cluster, allowing for elliptical clusters.
b) The covariance matrices are always identity matrices, resulting in spherical clusters.
c) The covariance matrices affect only the size but not the orientation of the clusters.
d) The covariance matrices are used to normalize the data before clustering.
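The effect described in option (a) can be checked numerically: an off-diagonal covariance entry tilts a Gaussian component's ellipse along a correlated direction, while a scaled identity covariance is rotationally symmetric. A minimal sketch with a hand-rolled 2D Gaussian density (values chosen for illustration):

```python
import math

# Density of a 2D Gaussian with a full 2x2 covariance matrix.
def gaussian_density_2d(x, mean, cov):
    (a, b), (c, d) = cov
    det = a * d - b * c
    inv = [[d / det, -b / det], [-c / det, a / det]]
    dx = [x[0] - mean[0], x[1] - mean[1]]
    # Mahalanobis distance dx^T cov^{-1} dx
    m = (dx[0] * (inv[0][0] * dx[0] + inv[0][1] * dx[1])
         + dx[1] * (inv[1][0] * dx[0] + inv[1][1] * dx[1]))
    return math.exp(-0.5 * m) / (2 * math.pi * math.sqrt(det))

mean = [0.0, 0.0]
tilted = [[2.0, 1.5], [1.5, 2.0]]     # off-diagonal term tilts the ellipse
spherical = [[2.0, 0.0], [0.0, 2.0]]  # scaled identity: a circle

# With the tilted covariance, a point along y = x is more likely than a
# point the same distance away along y = -x; the spherical covariance
# cannot tell the two directions apart.
assert gaussian_density_2d([1, 1], mean, tilted) > gaussian_density_2d([1, -1], mean, tilted)
assert abs(gaussian_density_2d([1, 1], mean, spherical)
           - gaussian_density_2d([1, -1], mean, spherical)) < 1e-12
```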
3. Which of the following best explains how boosting improves model performance?
a) By training all weak learners independently and averaging their outputs.
b) By using a single strong learner trained on the entire dataset.
c) By sequentially training weak learners where each learner focuses on correcting the errors of the previous ones.
d) By randomly selecting subsets of data and features to train each weak learner.
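The sequential error-correcting behavior in option (c) can be sketched with a tiny AdaBoost on 1D decision stumps (the data and helper names below are illustrative, not a full implementation): each round fits the stump with the lowest weighted error, then up-weights the misclassified points so the next learner focuses on them.

```python
import math

# Fit the 1D decision stump (threshold classifier) minimizing weighted error.
def fit_stump(xs, ys, w):
    best = None
    for t in sorted(set(xs)):
        for sign in (1, -1):
            pred = [sign if x >= t else -sign for x in xs]
            err = sum(wi for wi, p, y in zip(w, pred, ys) if p != y)
            if best is None or err < best[0]:
                best = (err, t, sign)
    return best

def adaboost(xs, ys, rounds=3):
    n = len(xs)
    w = [1.0 / n] * n
    ensemble = []  # list of (alpha, threshold, sign)
    for _ in range(rounds):
        err, t, sign = fit_stump(xs, ys, w)
        err = max(err, 1e-10)
        alpha = 0.5 * math.log((1 - err) / err)
        ensemble.append((alpha, t, sign))
        # Re-weight: misclassified points get heavier, then normalize.
        w = [wi * math.exp(-alpha * y * (sign if x >= t else -sign))
             for wi, x, y in zip(w, xs, ys)]
        total = sum(w)
        w = [wi / total for wi in w]
    return ensemble

def predict(ensemble, x):
    score = sum(a * (s if x >= t else -s) for a, t, s in ensemble)
    return 1 if score >= 0 else -1

# Labels - - + + - flip twice, so no single stump is perfect,
# but the boosted ensemble classifies every point correctly.
xs = [0, 1, 2, 3, 4]
ys = [-1, -1, 1, 1, -1]
model = adaboost(xs, ys, rounds=3)
assert [predict(model, x) for x in xs] == ys
```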
4. Decision tree: Which one of the following statements is false?
a) Decision tree and AdaBoost both maintain the explainability of the model.
b) Decision tree can be applied to both classification and regression tasks.
c) Decision tree works with real and categorical data.
d) Decision tree is a linear model.
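Statement (d) can be falsified with a two-line example: a depth-2 tree of axis-aligned splits solves XOR, a dataset no linear classifier can separate. A minimal sketch (hand-built tree, not a fitted model):

```python
# Each root-to-leaf path of a decision tree carves out an axis-aligned
# rectangle in feature space, so the decision boundary is piecewise
# constant rather than a single hyperplane.
def tree_predict(x1, x2):
    if x1 <= 0.5:
        return 1 if x2 > 0.5 else 0
    else:
        return 0 if x2 > 0.5 else 1

# XOR: linearly inseparable, yet perfectly captured by this depth-2 tree.
xor_data = [((0, 0), 0), ((0, 1), 1), ((1, 0), 1), ((1, 1), 0)]
assert all(tree_predict(x1, x2) == y for (x1, x2), y in xor_data)
```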
5. The Naive Bayes classifier makes a strong assumption about the features in the dataset. Which of the following best describes this assumption?
a) The features are linearly dependent on each other.
b) The features are conditionally independent given the class label.
c) The features follow a uniform distribution regardless of the class label.
d) The features have equal variance across all class labels.
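The assumption in option (b) means the joint likelihood factorizes per feature: P(x1, x2, … | y) = ∏j P(xj | y). A minimal sketch with hypothetical toy spam probabilities (the words and numbers are invented for illustration):

```python
import math

# Per-word likelihoods conditioned on the class label, plus class priors.
p_word_given_class = {
    "spam": {"free": 0.8, "meeting": 0.1},
    "ham":  {"free": 0.2, "meeting": 0.7},
}
p_class = {"spam": 0.4, "ham": 0.6}

def posterior_scores(words):
    # Unnormalized P(y) * prod_j P(x_j | y) -- the conditional
    # independence assumption lets us multiply per-feature likelihoods.
    return {c: p_class[c] * math.prod(p_word_given_class[c][w] for w in words)
            for c in p_class}

# The argmax over classes is the Naive Bayes prediction.
assert posterior_scores(["free"])["spam"] > posterior_scores(["free"])["ham"]
assert posterior_scores(["meeting"])["ham"] > posterior_scores(["meeting"])["spam"]
```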