Eclectic Extraction of Propositional Rules from Neural Networks (1110.0214v1)

Published 2 Oct 2011 in cs.LG, cs.AI, cs.CV, and cs.NE

Abstract: Artificial Neural Networks are among the most popular algorithms for supervised learning. However, Neural Networks have the well-known drawback of being "black box" learners that are not comprehensible to users. This lack of transparency makes them unsuitable for many high-risk tasks, such as medical diagnosis, that require a rational justification for each decision. Rule Extraction methods attempt to address this limitation by extracting comprehensible rules from a trained network. Many such extraction algorithms have been developed over the years, each with its own strengths and weaknesses. They have been broadly categorized into three types based on how they use the internal model of the network. Eclectic methods are hybrid algorithms that combine the other approaches to attain better performance. In this paper, we present an eclectic method called HERETIC. Our algorithm uses inductive decision tree learning combined with information about the neural network's structure to extract logical rules. Experiments and theoretical analysis show HERETIC to be better in terms of both speed and performance.
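
The abstract does not spell out implementation details, so the following is only a minimal sketch of the general decision-tree-based rule-extraction idea it refers to, not the HERETIC algorithm itself (HERETIC additionally exploits the network's internal structure, which this sketch omits). The scikit-learn classes and the Iris dataset are assumptions chosen for illustration.

```python
# Illustrative sketch (assumed setup, not the paper's HERETIC algorithm):
# approximate a trained neural network with a decision tree and read the
# tree's paths off as propositional IF-THEN rules.
from sklearn.datasets import load_iris
from sklearn.neural_network import MLPClassifier
from sklearn.tree import DecisionTreeClassifier, export_text

X, y = load_iris(return_X_y=True)
feature_names = load_iris().feature_names

# 1. Train the "black box" neural network.
net = MLPClassifier(hidden_layer_sizes=(10,), max_iter=3000,
                    random_state=0).fit(X, y)

# 2. Relabel the inputs with the network's own predictions, so the tree
#    learns to imitate the network rather than the raw data.
y_net = net.predict(X)

# 3. Fit an interpretable decision tree to the network's behaviour.
tree = DecisionTreeClassifier(max_depth=3, random_state=0).fit(X, y_net)

# 4. Each root-to-leaf path is a comprehensible rule for the network's output.
print(export_text(tree, feature_names=feature_names))
```

In this pedagogical-style sketch the tree only mimics the network's input-output behaviour; an eclectic method such as the one described in the paper would also use information from the network's hidden units and weights when building the rules.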

Authors (1)
  1. Ridwan Al Iqbal (4 papers)
Citations (9)
