Submission declined on 23 April 2024 by Stuartyeates (talk). This draft's references do not show that the subject qualifies for a Wikipedia article; the draft needs multiple published sources that are in-depth, reliable, secondary, and independent of the subject. Reviewer's comment: This needs to start with a comprehensible explanation of what black-box function inversion is, grounded not in abstract math but in real-world problems, and then explain the work of Hellman and further work.
The Fiat–Naor algorithm [1] [2] is an algorithm proposed by Amos Fiat and Moni Naor for black-box function inversion with preprocessing. In this problem, a computationally unbounded preprocessing algorithm is given a function f : [N] → [N] and outputs an advice string of S bits. Given the advice and black-box access to f, an online algorithm is asked to invert f at some point y in time T. The goal is to obtain the best tradeoff between S and T. The Fiat–Naor algorithm works for every S and T satisfying TS³ = Ω̃(N³) (where the Ω̃-notation hides a poly-logarithmic factor). This is essentially the best tradeoff known for this problem so far.
The task of black-box function inversion with preprocessing is as follows. In the preprocessing stage, a computationally unbounded preprocessing algorithm is given a function f : [N] → [N], where [N] denotes the set {1, 2, …, N}, and outputs an advice string σ of S bits. In the online stage, an online algorithm is given black-box access to f, the advice σ, and some point y in the image of f, and is asked to find some x such that f(x) = y in time T. Black-box access to f allows the algorithm to query any x ∈ [N] and obtain f(x).
There are two naive solutions. The preprocessing algorithm can store the entire inverse table, which gives S = Õ(N) and T = O(1); alternatively, the online algorithm can search the whole domain by brute force, which gives S = 0 and T = O(N). Combining the two solutions, we get an algorithm working for every S and T satisfying S + T = Ω̃(N) in general.
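The two naive solutions and their combination can be sketched as follows. This is a minimal illustration; the function names, parameters, and choice of Python are ours, not from the literature.

```python
def preprocess_partial(f, domain, s):
    """Advice: an inverse table for the first s domain points (about s entries).

    s = len(domain) gives the full-table solution (T = O(1));
    s = 0 degenerates to pure brute force (S = 0, T = O(N)).
    """
    return {f(x): x for x in domain[:s]}

def invert_partial(f, domain, s, table, y):
    """A table hit costs one lookup; otherwise brute-force the remaining
    N - s uncovered points, so space + time is about N for any split s."""
    if y in table:
        return table[y]
    for x in domain[s:]:
        if f(x) == y:
            return x
    return None
```

Any intermediate split s trades space for time along the line S + T ≈ N, matching the combined naive tradeoff above.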
Martin Hellman [3] was the first to study inverting functions with preprocessing. His algorithm, later made rigorous by Fiat and Naor, [1] can invert a random function f with high probability over the choice of f for every S and T satisfying TS² = Ω̃(N²). However, it does not work for arbitrary functions.
Andrew Yao [4] proved a lower bound of ST = Ω̃(N) for function inversion.
The Fiat–Naor algorithm inverts an arbitrary function f at any image y with high probability over the randomness of the preprocessing algorithm. The algorithm works as follows.
The algorithm takes a space parameter S and a time parameter T, which will differ from the actual space and time by poly-logarithmic factors in N. S and T determine the other parameters used below, where Θ-notation hides constant factors.
G is required to be a family of k-wise independent hash functions for some k, each function having a small-space description. The functions g₁, …, gℓ are chosen in some way such that they are pairwise independent, and the evaluation of these functions takes amortized O(1) time. Fiat and Naor gave such a construction, in which these functions are not fully independent, in order to achieve the amortized evaluation time.
The subroutines in the Fiat–Naor algorithm are built on Hellman's algorithm for inverting a random function. [3]
The main idea of the preprocessing is to build chains of length t of the form x → f(x) → f(f(x)) → ⋯ and store the starting point and ending point of each chain in a look-up table as the advice. On input y, the online algorithm iteratively evaluates f. In each step, if it reaches the endpoint of some chain, it immediately jumps back to that chain's starting point. If y is covered by some chain (not as a starting point), then the algorithm will reach y again within t steps and find the preimage, which is the immediate predecessor of y in this chain.
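The chain construction and the endpoint-jump lookup described above can be sketched as follows. This is an illustrative single-table toy with parameters of our choosing; the real algorithm uses many tables and a more careful treatment of collisions.

```python
import random

def build_chains(f, n, m, t, seed=0):
    """Preprocessing: build m chains of length t over {0, ..., n-1} and
    keep only the pairs (endpoint -> starting point) as the advice."""
    rng = random.Random(seed)
    table = {}
    for _ in range(m):
        start = rng.randrange(n)
        x = start
        for _ in range(t):
            x = f(x)
        table[x] = start          # endpoint maps back to its start
    return table

def invert_chain(f, table, t, y):
    """Online stage: iterate f from y; on hitting a stored endpoint, jump
    back to that chain's start and walk it looking for a predecessor of y."""
    x = y
    for _ in range(t + 1):
        if x in table:
            z = table[x]
            for _ in range(t):
                if f(z) == y:
                    return z      # the immediate predecessor of y
                z = f(z)
        x = f(x)                  # keep walking (false alarms are possible)
    return None
```

The explicit check f(z) = y guarantees that whatever is returned is a genuine preimage, even when an endpoint collision sends the walk to the wrong chain.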
A probabilistic analysis shows that with mt² = O(N), there are few enough collisions among m chains with randomly chosen starting points that the chains cover Ω(mt) distinct values. This allows inverting f at an Ω(mt/N)-fraction of points.
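The coverage claim can be checked empirically. The following toy simulation (our own, with arbitrarily chosen parameters) counts the distinct points covered by m random chains of a random function:

```python
import random

def chain_coverage(n, m, t, seed=0):
    """Count the distinct points covered (excluding starting points) by m
    length-t chains of a random function on {0, ..., n-1}."""
    rng = random.Random(seed)
    f = [rng.randrange(n) for _ in range(n)]   # a random function, as a table
    covered = set()
    for _ in range(m):
        x = rng.randrange(n)
        for _ in range(t):
            x = f[x]
            covered.add(x)
    return len(covered)
```

With mt² on the order of n (say n = 10000, m = 10, t = 30), the chains cover close to the maximum of mt = 300 points; pushing mt² far above n makes collisions dominate, and coverage stalls.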
It remains to amplify the success probability. For this purpose, the above algorithm is not applied directly to f, but to f re-randomized by a random function h, i.e., to g = h ∘ f. Then, by repeating ℓ = Õ(N/(mt)) times with independently chosen functions h₁, …, hℓ, every possible input is covered at least once with high probability. Hellman's algorithm uses space S = Õ(ℓm) and time T = Õ(ℓt) (where the Õ-notation hides poly-logarithmic factors in N), so mt² = Θ(N) is sufficient to obtain TS² = Õ(N²).
Since a random function is unlikely to have a succinct description, Hellman suggested heuristically using some explicit function family. Fiat and Naor made this argument rigorous with their construction of the family G. They also showed that, in general, for every function f in which every image has at most A preimages, Hellman's algorithm works if mt² = O(N/A), and thus for every S and T satisfying TS² = Ω̃(A·N²).
Hellman's algorithm does not obtain a good tradeoff when some elements have many preimages, which makes A large. This case has to be handled in order to invert arbitrary functions.
The look-up table is exactly for this purpose. It contains Õ(S) random elements of the domain and their images. These images can be inverted by searching in the table. It remains to handle the images not included in the table, that is, the function f restricted to the remaining domain D, consisting of the points whose images are not in the table. Since the table entries are chosen uniformly at random, with high probability every image of f with more than Õ(N/S) preimages is contained in the table. Therefore, the remaining images have relatively few preimages and can be inverted using Hellman's algorithm.
To apply Hellman's algorithm over the partial domain D, we use a function g mapping into D to re-randomize f. Then the composed function becomes a function inside D. Note that hashing into an arbitrary subset is difficult. For this reason, g is designed as g(y) = h_j(y), where j is the smallest index in {1, …, k} such that h_j(y) ∈ D and the functions h_j are drawn from the k-wise independent family G. The online algorithm has to query f to check whether h_j(y) ∈ D, so it makes at most k more queries to evaluate g.
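The "first candidate that lands in D" construction above can be sketched as follows. The hash functions here are illustrative affine maps, not Fiat and Naor's actual k-wise independent family, and the membership test for D is an abstract callback (in the real algorithm it costs one query to f plus a search in the look-up table).

```python
def make_g(hs, in_D):
    """Build g(y) = h_j(y) for the smallest j with h_j(y) in D.

    hs:   a list of hash functions h_1, ..., h_k (hypothetical stand-ins).
    in_D: a membership test for the subset D of the domain.
    """
    def g(y):
        for h in hs:
            x = h(y)
            if in_D(x):
                return x
        return None  # fails only with small probability for a suitable k
    return g
```

For k large enough relative to the density of D, every y finds some candidate in D with high probability, so g behaves like a re-randomizing map into D at the cost of at most k extra membership checks.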
Since every image of f over D has at most Õ(N/S) preimages, setting mt² = Θ̃(S) guarantees that Ω(mt) elements of D are covered by the chains in each subroutine. Thus ℓ = Õ(N/(mt)) repetitions suffice to amplify the success probability for every image. The actual space and time are Õ(ℓm + S) and Õ(ℓtk), respectively. By setting the parameters as above, it is guaranteed that TS³ = Õ(N³).
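The tradeoff can be checked by a back-of-the-envelope calculation, using the symbols ℓ, m, t, S, T as in this article, ignoring poly-logarithmic factors, and assuming the parameter relation mt² ≈ S sketched above:

```latex
\[
S \approx \ell m, \qquad T \approx \ell t, \qquad \ell \approx \frac{N}{mt}
\;\Longrightarrow\; S \approx \frac{N}{t}, \qquad T \approx \frac{N}{m},
\]
\[
S^2 T \approx \frac{N}{t}\cdot\frac{N}{t}\cdot\frac{N}{m}
      = \frac{N^3}{mt^2}
      \approx \frac{N^3}{S} \qquad (\text{using } mt^2 \approx S),
\]
\[
\text{hence} \qquad T S^3 \approx N^3 .
\]
```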
The line of research on function inversion has produced practical tools for cryptanalysis, such as the rainbow table. [5] Because of the generality of function inversion, the Fiat–Naor algorithm can be applied to other data structure problems such as string searching [6] and 3SUM-Indexing. [7] [8] It has also been applied in computational complexity theory to beat some conjectured lower bounds. [9] [10]