dchalmer — Posted September 17, 2013

I have a two-class classification problem. I would like to train a multivariate classifier with 100% positive predictive value. In other words, I want the model to completely avoid false positives for one of the classes. For this application a low-ish sensitivity is OK as long as PPV is ~100%. Do you have any suggestions of good techniques to use? Thank you!
DMX — Posted September 17, 2013 (edited)

Do a logistic regression and pick an operating point along the ROC curve as you see fit, depending on what your precision/recall trade-off needs to be. Raising the decision threshold trades sensitivity for higher PPV.
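The threshold-sweeping idea above can be sketched in plain Python. The scores and labels here are made-up illustrative data (not from either poster): the sweep finds the lowest threshold at which every predicted positive is a true positive (PPV = 1.0 on this sample), and reports the recall that remains.

```python
# Hypothetical model scores (predicted probabilities) and true labels,
# invented purely for illustration.
scores = [0.95, 0.90, 0.85, 0.80, 0.70, 0.60, 0.40, 0.30, 0.20, 0.10]
labels = [1,    1,    1,    0,    1,    0,    0,    1,    0,    0]

def precision_recall_at(threshold, scores, labels):
    """Precision (PPV) and recall when predicting positive at score >= threshold."""
    tp = sum(1 for s, y in zip(scores, labels) if s >= threshold and y == 1)
    fp = sum(1 for s, y in zip(scores, labels) if s >= threshold and y == 0)
    fn = sum(1 for s, y in zip(scores, labels) if s < threshold and y == 1)
    precision = tp / (tp + fp) if tp + fp else 1.0
    recall = tp / (tp + fn) if tp + fn else 0.0
    return precision, recall

def lowest_threshold_with_perfect_ppv(scores, labels):
    """Sweep the observed scores as candidate thresholds, low to high,
    and return (threshold, recall) for the lowest one with precision 1.0."""
    for t in sorted(set(scores)):
        p, r = precision_recall_at(t, scores, labels)
        if p == 1.0:
            return t, r
    return None

print(lowest_threshold_with_perfect_ppv(scores, labels))  # → (0.85, 0.6)
```

One caveat worth noting: a threshold with 100% PPV on training data does not guarantee 100% PPV on new data, so the chosen operating point should be validated on a held-out set.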