7

In my machine learning class I have been given a weight vector that is described as having the property of being "generously feasible".

Formally, what does "generously feasible" mean? I can't seem to find a definition.

  • 0
The tinyurl to slide 23 is no longer valid. Here's something current: http://www.cs.toronto.edu/~tijmen/csc321/slides/lecture_slides_lec2.pdf#23 (2016-10-12)

2 Answers

3

If the weight vector at the current iteration lies in the region between the constraint hyperplane and a distance equal to the magnitude of the input vector, i.e. $0 \le \frac{\langle \vec{w}^{\,t}, \vec{x} \rangle}{|\vec{x}|} < |\vec{x}|$, where $\langle \vec{w}, \vec{x} \rangle = 0$ is the hyperplane, then, since the perceptron adds $\vec{x}$ or $-\vec{x}$ to the weights on each mistake, the weight vector can oscillate around the hyperplane. Hence, for the convergence argument to guarantee that the algorithm terminates with a solution, it must aim for weight vectors that clear every constraint plane by a margin of at least $|\vec{x}|$; that region is called the "generously feasible" region.
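
To make the margin condition concrete, here is a minimal Python sketch (the helper name, the toy inputs, and the labels are assumptions for illustration, not taken from the course slides) that tests whether a weight vector clears every constraint plane by at least the length of the corresponding input vector:

    import numpy as np

    def is_generously_feasible(w, X, y):
        """Check that w classifies every case with a margin, measured along x,
        of at least |x|, i.e. y * <w, x> / |x| >= |x|."""
        for x, label in zip(X, y):
            # signed distance of w from the constraint plane <w, x> = 0,
            # measured along the unit vector x / |x|
            margin = label * np.dot(w, x) / np.linalg.norm(x)
            if margin < np.linalg.norm(x):
                return False
        return True

    # toy data (assumed for illustration): two 2-D inputs with +/-1 labels
    X = np.array([[1.0, 0.5], [0.2, -1.0]])
    y = np.array([1, -1])

    print(is_generously_feasible(np.array([0.3, 0.1]), X, y))  # too close to a plane -> False
    print(is_generously_feasible(np.array([3.0, 4.0]), X, y))  # well inside the feasible cone -> True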

  • 0
If there is oscillation, doesn't that mean that the hypercone of feasible solutions doesn't even exist, given that this is a convex convergence problem? (2017-02-05)
1

So consider “generously feasible” weight vectors that lie within the feasible region by a margin at least as great as the length of the input vector that defines each constraint plane.

Every time the perceptron makes a mistake, the squared distance to every one of these generously feasible weight vectors decreases by at least the squared length of the input vector used in the update.
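
As a quick sanity check of that claim, here is a small numerical sketch (the specific vectors are made-up assumptions; the update follows the usual perceptron rule for a misclassified positive case):

    import numpy as np

    x = np.array([1.0, 2.0])                # a positive training case
    w = np.array([-0.5, 0.2])               # current weights; they misclassify x since <w, x> <= 0
    w_star = 10.0 * x / np.linalg.norm(x)   # a generously feasible vector: its margin along x is 10 > |x|

    assert np.dot(w, x) <= 0                # x really is misclassified
    assert np.dot(w_star, x) / np.linalg.norm(x) >= np.linalg.norm(x)  # generous-margin condition

    w_new = w + x                           # perceptron update after the mistake

    before = np.sum((w_star - w) ** 2)      # squared distance to w* before the update
    after = np.sum((w_star - w_new) ** 2)   # ... and after
    print(before - after >= np.dot(x, x))   # drops by at least |x|^2 -> True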