Report of Referee #2:

This work demonstrates that it is possible to automatically find smaller networks to decode surface codes compared to manually designed ones. I find it satisfies the acceptance criteria. (I'm not a good judge of writing and formatting, so I won't comment on these aspects.) Please at least add some comments about questions 1 and 2 in the paper.

1. In recent years there have been a lot of works on AutoML, which also concerns finding neural network structures automatically. Can you comment on how NEAT compares to some newer methods in AutoML?

2. Why is the gradient-free aspect of NEAT considered an advantage? In recent years, a main reason to use neural networks is that they work so well together with gradient optimization.

3. A related question: since NEAT doesn't need gradient optimization, have you considered using other machine learning models, such as decision trees, instead of neural networks?

4. I feel a good way to scale up might be to concatenate your neural decoders with another, easily scalable decoder. Do you think the end result could be better than this work? Will this be possible?

Our reply:

Referee #2 asks on-point questions that we will elaborate upon here (and in the updated manuscript). We thank the referee for raising these points, which we have used to improve our manuscript, and for putting our work into perspective. The aim of our manuscript was not to provide an algorithm that scales to much larger systems, though indeed smaller networks (fewer weights) might be beneficial there. A better approach to scaling, as the referee suggests, would be to combine small decoder networks with more easily scalable decoders, as in Meinerz et al. Possibly, other approaches such as the hyperNEAT algorithm (Refs. 21 and 30 in the manuscript) are also of interest, since they can optimize large networks (which NEAT is not very good at). Finally, gradient-free optimization has a few potentially advantageous properties that we wanted to explore.
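To make the gradient-free point (questions 2 and 3 above) concrete, here is a minimal sketch, not taken from the manuscript: candidate decoders are scored by a plain fitness function and improved by mutation and selection alone, so the fitness can be any non-differentiable quantity (e.g., a logical success rate over full decoding cycles). For brevity the sketch evolves only the weights of a fixed tiny network on placeholder data; NEAT additionally mutates the topology (adding nodes and connections) and protects new structures through speciation. All names and data below are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

# Placeholder data standing in for (syndrome, label) pairs; a real fitness
# would be the decoder's success rate on simulated surface-code syndromes.
X = rng.integers(0, 2, size=(256, 8)).astype(float)  # toy 8-bit "syndromes"
y = (X.sum(axis=1) > 4).astype(int)                  # toy labels

def forward(params, x):
    """Tiny fixed-topology network, 8 -> 4 -> 1."""
    w1, b1, w2, b2 = params
    return (np.tanh(x @ w1 + b1) @ w2 + b2).ravel()

def fitness(params):
    """Any scalar score works here -- no gradients are ever taken."""
    return ((forward(params, X) > 0).astype(int) == y).mean()

def random_params():
    return [rng.normal(0, 1, (8, 4)), np.zeros(4),
            rng.normal(0, 1, (4, 1)), np.zeros(1)]

def mutate(params, sigma=0.1):
    return [p + rng.normal(0, sigma, p.shape) for p in params]

# Plain (mu + lambda)-style loop: keep the 5 best, refill with mutated copies.
population = [random_params() for _ in range(20)]
for _ in range(100):
    population.sort(key=fitness, reverse=True)
    parents = population[:5]
    population = parents + [mutate(p) for p in parents for _ in range(3)]

print("best toy accuracy:", fitness(max(population, key=fitness)))
```

Because selection only compares fitness values, swapping the network for a decision tree (question 3) changes `forward` and `mutate` but leaves the loop untouched, which is one way to read the referee's suggestion.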
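On the scaling question (point 4 and the reply's mention of Meinerz et al.), the proposed division of labor can be sketched abstractly. The following toy pipeline is a hypothetical illustration only, loosely in the spirit of that suggestion: a small learned decoder sweeps fixed-size local patches, and a conventional scalable decoder absorbs whatever defects remain. For simplicity all bookkeeping is done in syndrome space (each stage records which defects it takes responsibility for); neither stand-in implements a real decoder, and the actual interface in Meinerz et al. differs.

```python
import numpy as np

def local_decoder(patch):
    """Stand-in for a small evolved network: it only claims patches that
    contain a single defect, mimicking a decoder for local errors."""
    return patch.copy() if patch.sum() == 1 else np.zeros_like(patch)

def global_decoder(residual):
    """Stand-in for a scalable conventional decoder (e.g., union-find or
    minimum-weight matching); here it trivially absorbs the leftovers."""
    return residual.copy()

def hybrid_decode(syndrome, patch_size=2):
    correction = np.zeros_like(syndrome)
    L = syndrome.shape[0]
    # Stage 1: sweep the small learned decoder over fixed-size patches.
    for i in range(0, L, patch_size):
        for j in range(0, L, patch_size):
            block = syndrome[i:i + patch_size, j:j + patch_size]
            correction[i:i + patch_size, j:j + patch_size] = local_decoder(block)
    # Stage 2: hand whatever syndrome remains to the scalable decoder.
    residual = syndrome ^ correction
    return correction ^ global_decoder(residual)

syndrome = np.zeros((4, 4), dtype=int)
syndrome[0, 1] = syndrome[2, 2] = syndrome[2, 3] = 1
print(hybrid_decode(syndrome))  # stage 1 clears (0,1); stage 2 gets the pair
```

The appeal of such a concatenation is that the evolved network stays small (its input is one patch, independent of the code distance), while scalability is delegated to the second stage.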