I am trying to link two somewhat disparate concepts: a binary classifier and a fractal structure. I suspect that a binary classifier built on such a structure, in particular the Mandelbrot set, could be considerably more accurate than others, although of course a random forest classifier is bound to give it tough competition.
We are all familiar with the Mandelbrot set. Its fractal boundary separates the complex parameters $c$ for which the orbit of the map
$z_{n+1} = (z_n)^2 + c$, with $z_0 = 0$, where $z_n$ and $z_{n+1}$ are the values of the complex number at the $n$th and $(n+1)$th iterations respectively,
remains bounded after a sizable number of iterations, from those for which the orbit rushes to infinity or "dies".
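The bounded-vs-escaping test above can be sketched as a short escape-time routine (a hypothetical helper of my own, using the standard fact that an orbit provably diverges once $|z_n| > 2$):

```python
def escape_time(c: complex, max_iter: int = 100) -> int:
    """Iterate z -> z^2 + c from z = 0; return the iteration at which
    |z| exceeds 2 (escape), or max_iter if the orbit stays bounded."""
    z = 0j
    for n in range(max_iter):
        if abs(z) > 2.0:  # once |z| > 2 the orbit is guaranteed to diverge
            return n
        z = z * z + c
    return max_iter

# c = 0 stays bounded forever; c = 1 escapes after a few iterations.
print(escape_time(0j))      # -> 100 (treated as "bounded")
print(escape_time(1 + 0j))  # -> 3  (escapes quickly)
```

The cutoff `max_iter` plays the role of the "sizable number of iterations" in the description above.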
The problem before us is the standard classification problem in machine learning:
consider a data set in which $X_1, X_2, \cdots, X_n$ are the independent variables (all quantitative numerical variables) and $Y$ is the dependent variable (binary, taking values in $\{0,1\}$).
We want to classify the observations, i.e. we wish to know under what conditions on the values of $X_1, X_2, \cdots, X_n$ the value of $Y$ is $0$ or $1$.
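To make the setup concrete, here is a minimal sketch on hypothetical toy data: two quantitative features and a binary target. The generating rule ($Y = 1$ inside the unit circle) and the simple $k$-nearest-neighbours baseline are my own assumptions, chosen only to show a nonlinear decision boundary being learned:

```python
import random

random.seed(0)

# Hypothetical toy data: Y = 1 when the point lies inside the unit circle.
pts = [(random.uniform(-2, 2), random.uniform(-2, 2)) for _ in range(400)]
lab = [1 if x * x + y * y < 1 else 0 for (x, y) in pts]
X_train, Y_train = pts[:300], lab[:300]
X_test, Y_test = pts[300:], lab[300:]

def knn_predict(x, k=5):
    """Classify x by majority vote among its k nearest training points."""
    dists = sorted(
        ((p[0] - x[0]) ** 2 + (p[1] - x[1]) ** 2, y)
        for p, y in zip(X_train, Y_train)
    )
    votes = [y for _, y in dists[:k]]
    return 1 if sum(votes) > k // 2 else 0

acc = sum(knn_predict(x) == y for x, y in zip(X_test, Y_test)) / len(X_test)
print(f"test accuracy: {acc:.2f}")
```

Any classifier for this problem, Mandelbrot-based or otherwise, is answering the same question: what region of feature space maps to $Y = 1$?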
This problem is solved in many ways. One of them is to construct kernel functions, as done in SVMs. The idea is to somehow link the kernel function with the Mandelbrot set.
Why the Mandelbrot set? Because its boundary is fractal, and so it has intricate detail at every scale, which might allow a classifier to represent highly complex decision boundaries.
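One hedged way to make this link concrete (a sketch of my own, not an established construction): map each two-dimensional observation $(x_1, x_2)$ to the complex parameter $c = x_1 + i x_2$, compute its escape time $T$ under $z \mapsto z^2 + c$, and build an RBF-style kernel on escape times, $k(x, x') = \exp(-\gamma\,(T(x) - T(x'))^2)$. Since this is an RBF kernel applied to the transformed inputs $T(x)$, it is a valid positive semidefinite kernel:

```python
import math

def escape_time(c: complex, max_iter: int = 100) -> int:
    """Iteration at which |z| exceeds 2 under z -> z^2 + c, else max_iter."""
    z = 0j
    for n in range(max_iter):
        if abs(z) > 2.0:
            return n
        z = z * z + c
    return max_iter

def mandelbrot_kernel(x, y, gamma: float = 0.1) -> float:
    """Compare two 2-D points through the escape times of c = x1 + i*x2."""
    tx = escape_time(complex(x[0], x[1]))
    ty = escape_time(complex(y[0], y[1]))
    return math.exp(-gamma * (tx - ty) ** 2)

# Points with similar escape behaviour get kernel values near 1;
# a bounded point versus a fast-escaping one gives a value near 0.
print(mandelbrot_kernel((0.0, 0.0), (-0.1, 0.0)))
print(mandelbrot_kernel((0.0, 0.0), (1.5, 0.0)))
```

Whether such a kernel helps on real data is exactly the open question here; the escape time is a very coarse one-dimensional summary, so this sketch only shows that the construction is mechanically possible.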
Has there been such a classifier constructed before?
Any comments about its accuracy, feasibility, etc.? I have not visualized this function in detail, but I do think that such a classifier would have some decisive advantages over learning machines that use commonly used kernel functions like the linear and polynomial ones.
ADDED: Henning Makholm provided valuable input regarding the vagueness of my question.