Abstract
We consider the problem of learning k-parities in the online mistake-bound model: given a hidden vector x ∈ {0,1}^n whose Hamming weight is k, and a sequence of “questions” a_1, a_2, … ∈ {0,1}^n, where the algorithm must reply to each question a_i with ⟨a_i, x⟩ (mod 2), what is the best trade-off between the number of mistakes made by the algorithm and its time complexity? We improve the previous best result of Buhrman et al. [3] by an exp(k) factor in the time complexity. Next, we consider the problem of learning k-parities in the PAC model in the presence of random classification noise of rate η ∈ (0, 1/2). Here, we observe that even in the presence of classification noise of non-trivial rate, it is possible to learn k-parities in time better than (n choose k/2), whereas the current best algorithm for learning noisy k-parities, due to Grigorescu et al. [9], inherently requires time (n choose k/2) even when the noise rate is polynomially small.
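To make the online mistake-bound setting concrete, the following is a minimal Python sketch of the interaction: a hidden k-sparse parity answers inner-product-mod-2 questions, and a brute-force halving learner (an illustrative baseline, not the paper's algorithm; the names `parity` and `halving_learner` are ours) predicts by majority vote over the consistent k-sparse candidates. This baseline makes only O(k log n) mistakes but spends roughly (n choose k) time per round, which is exactly the kind of mistakes-versus-time trade-off the paper studies.

```python
# Sketch of the online k-parity setup with a brute-force halving learner.
# Illustrative baseline only, assuming a uniform stream of random questions.
from itertools import combinations
import random


def parity(support, a):
    """Return <a, x> (mod 2), where x is the indicator vector of `support`."""
    return sum(a[i] for i in support) % 2


def halving_learner(n, k, questions, hidden_support):
    """Predict each answer by majority vote over the surviving k-sparse candidates."""
    candidates = [frozenset(c) for c in combinations(range(n), k)]
    mistakes = 0
    for a in questions:
        votes_for_one = sum(parity(c, a) for c in candidates)
        prediction = 1 if 2 * votes_for_one > len(candidates) else 0
        truth = parity(hidden_support, a)
        if prediction != truth:
            mistakes += 1
        # Keep only candidates consistent with the revealed answer;
        # a mistake removes at least half of them, giving O(k log n) mistakes total.
        candidates = [c for c in candidates if parity(c, a) == truth]
    return mistakes


if __name__ == "__main__":
    n, k = 12, 3
    hidden = frozenset(random.sample(range(n), k))
    questions = [[random.randint(0, 1) for _ in range(n)] for _ in range(50)]
    print("mistakes:", halving_learner(n, k, questions, hidden))
```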
Original language | English |
---|---|
Pages (from-to) | 249-256 |
Number of pages | 8 |
Journal | Theoretical Computer Science |
Volume | 840 |
Early online date | 27 Aug 2020 |
DOIs | |
Publication status | Published - 6 Nov 2020 |
MoE publication type | A1 Journal article-refereed |
Keywords
- Learning k-parities
- Learning sparse parities
- Learning sparse parities with noise
- Mistake bound model
- PAC model