
Support Vector Machines for Pattern Classification

Shigeo Abe

Paperback / softback
04 May 2012
$280.00
Ships in 3-5 business days

A guide to the use of support vector machines (SVMs) in pattern classification, including a rigorous performance comparison of classifiers and regressors. The book presents architectures for multiclass classification and function approximation problems, as well as evaluation criteria for classifiers and regressors.

Features:
- Clarifies the characteristics of two-class SVMs
- Discusses kernel methods for improving the generalization ability of neural networks and fuzzy systems
- Contains ample illustrations and examples
- Includes performance evaluation using publicly available data sets
- Examines Mahalanobis kernels, the empirical feature space, and the effect of model selection by cross-validation
- Covers sparse SVMs, learning using privileged information, semi-supervised learning, multiple classifier systems, and multiple kernel learning
- Explores incremental-training-based batch training and active-set training methods, and decomposition techniques for linear programming SVMs
- Discusses variable selection for support vector regressors
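For readers unfamiliar with the two-class SVMs the description opens with: a linear SVM can be trained by minimizing a regularized hinge loss. The following is a rough illustrative sketch (not code from the book) using Pegasos-style subgradient descent in plain Python, on a tiny made-up dataset that is separable through the origin:

```python
def train_linear_svm(X, y, lam=0.01, epochs=200):
    """Pegasos-style subgradient descent on the regularized hinge loss:
    min_w (lam/2)*||w||^2 + (1/n) * sum_i max(0, 1 - y_i * <w, x_i>)."""
    dim = len(X[0])
    w = [0.0] * dim
    t = 0
    for _ in range(epochs):
        for x, label in zip(X, y):
            t += 1
            eta = 1.0 / (lam * t)  # decaying step size
            margin = label * sum(wi * xi for wi, xi in zip(w, x))
            # Shrink w toward zero (the regularization term) every step.
            w = [(1.0 - eta * lam) * wi for wi in w]
            if margin < 1:
                # Point violates the margin: move w toward classifying it.
                w = [wi + eta * label * xi for wi, xi in zip(w, x)]
    return w

def predict(w, x):
    return 1 if sum(wi * xi for wi, xi in zip(w, x)) >= 0 else -1

# Tiny two-class problem, linearly separable through the origin.
X = [(2.0, 2.0), (3.0, 3.0), (-2.0, -2.0), (-3.0, -1.0)]
y = [1, 1, -1, -1]
w = train_linear_svm(X, y)
print([predict(w, x) for x in X])  # [1, 1, -1, -1]
```

The bias term is omitted for brevity (the classic Pegasos formulation trains a hyperplane through the origin); the kernel methods, multiclass architectures, and model-selection techniques the book covers build on this basic formulation.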





Buy Books Online at BookLoop

Discover your next great read at BookLoop, an Australian online bookstore offering a vast selection of titles across various genres and interests. Whether you're curious about what's trending or searching for graphic novels that captivate, thrilling crime and mystery fiction, or exhilarating action and adventure stories, our curated collections have something for every reader. Delve into imaginative fantasy worlds or explore the realms of science fiction that challenge the boundaries of reality. Fans of contemporary narratives will find compelling stories in our contemporary fiction section, and our fantasy and science fiction titles offer epic journeys of their own.

Shop Trending Books and New Releases

Explore our new releases for the most recent additions in romance books, fantasy books, graphic novels, crime and mystery books, and science fiction books, as well as biographies, cookbooks, self-help books, tarot cards, fortune-telling, and much more. With titles covering current trends, BookTok and Bookstagram recommendations, and emerging authors, BookLoop remains your go-to local Australian bookstore for buying books online across all book genres.

Shop Best Books By Collection

Stay updated with the literary world by browsing our trending books, featuring the latest bestsellers and critically acclaimed works. Explore titles from popular brands like Minecraft, Pokemon, Star Wars, Bluey, Lonely Planet, ABIA award winners, Peppa Pig, and our specialised collection of ADHD books. At BookLoop, we are committed to providing a diverse and enriching reading experience for all.