performance - NP-complete, no efficient algorithm?
I do not know much about NP-completeness, but you can read more about it here and here. The introduction of the algorithms book I am reading (on my own) says: "Although no efficient algorithm for an NP-complete problem has ever been found, no one has ever proven that an efficient algorithm for one cannot exist." I'm just wondering: how do we know that no efficient algorithm exists, if we haven't tried the entire set of possible algorithms? Thank you, Sam

Great question! I'm not sure I can fully answer it, but here is my attempt. It is generally very hard to know whether a better algorithm exists than the one you have come up with (if you knew one did, you would use the better algorithm instead). The question becomes: when should you stop looking for better algorithms? The main approach is to come up with a lower bound for the problem, and then gradually make the bounds stronger and stronger.

A good place to start investigating this is the standard comparison-based sorting problem: we are given a list of n elements and want to sort it. The worst algorithm one could come up with enumerates all n! orderings and checks each one to see whether it is sorted. A more intuitive approach is something like bubble sort, which is O(n^2). We wonder whether we can do better still. Using divide and conquer, we arrive at merge sort, which is O(n log n). Now we want to know whether merge sort is the most efficient possible. We spend more time trying to think of a better algorithm but cannot come up with one, so we get frustrated, switch our approach, and instead try to prove that no comparison-based sort can beat O(n log n). An easy lower bound is Ω(n): just to sort the list, we need at least the time it takes to read it. Then we try to improve this lower bound. See if you can come up with an improvement of the lower bound to Ω(n log n), if you have not seen the argument before.
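To make the two extremes above concrete, here is a minimal sketch (the function names are my own, not from any book): the n!-orderings algorithm and a plain merge sort.

```python
from itertools import permutations

def brute_force_sort(xs):
    # Worst reasonable algorithm: try all n! orderings and
    # return the first one that happens to be sorted. O(n! * n).
    for perm in permutations(xs):
        if all(perm[i] <= perm[i + 1] for i in range(len(perm) - 1)):
            return list(perm)

def merge_sort(xs):
    # Divide and conquer: O(n log n) comparisons.
    if len(xs) <= 1:
        return list(xs)
    mid = len(xs) // 2
    left, right = merge_sort(xs[:mid]), merge_sort(xs[mid:])
    merged, i, j = [], 0, 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            merged.append(left[i])
            i += 1
        else:
            merged.append(right[j])
            j += 1
    # One of the two halves is exhausted; append the remainder.
    return merged + left[i:] + right[j:]
```

Both return the same sorted list; the only difference is how much work they do to find it, which is exactly the gap the lower-bound argument tries to pin down.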
If you cannot, check out this great article, which proves the Ω(n log n) comparison-sorting lower bound.

Now let's start thinking about NP-complete problems. Consider the vertex cover problem (is there a set of k vertices in a graph such that every edge is incident to at least one vertex in the set?), which is NP-complete. We come up with the most intuitive brute-force method to solve it: enumerate all choices of k vertices and test each possible solution. Now the question is: does something more efficient exist? Well, suppose that after a lot of effort we cannot come up with anything faster. So we try the lower-bound approach, just as before, and try to improve the bounds. Ω(n) is clearly a lower bound, but we cannot prove a lower bound anywhere near the brute-force running time of O(n^k) (if we could prove such a lower bound, then brute force would essentially be the best possible algorithm). So we take a break and work on other problems.

Then one day we are working on the maximum independent set problem on graphs (is there a set S of k vertices such that no two vertices in S are adjacent?). We come up with a brute-force solution, but we want to know whether it is the most efficient algorithm. Again we cannot do any better, and we cannot come up with a tight lower bound either, so we cannot say whether something faster exists. After several days we notice that these two problems are really the same, in the sense that an efficient algorithm for one gives an efficient algorithm for the other: S is an independent set if and only if the remaining vertices form a vertex cover. So even though we do not know whether we have the most efficient algorithm for vertex cover or for independent set, we can compare the problems' hardness relative to each other, so that if we find a good algorithm for one, we can apply it to the other problem. Basically this boils down to the Feynman view: imagine that you are a researcher
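As a small sketch of the reduction just described (the function names and graph encoding are my own choices), the brute-force vertex cover search immediately gives an independent set solver, because S is an independent set of size k exactly when the other n - k vertices cover every edge:

```python
from itertools import combinations

def has_vertex_cover(vertices, edges, k):
    # Brute force: try every subset of k vertices and check
    # whether it touches at least one endpoint of every edge.
    for cover in combinations(vertices, k):
        s = set(cover)
        if all(u in s or v in s for (u, v) in edges):
            return True
    return False

def has_independent_set(vertices, edges, k):
    # Reduction: an independent set S of size k exists
    # iff V \ S, of size n - k, is a vertex cover.
    return has_vertex_cover(vertices, edges, len(vertices) - k)

# A triangle: any 2 vertices cover all edges, but no 2 vertices
# are independent, so the largest independent set has size 1.
V = [0, 1, 2]
E = [(0, 1), (1, 2), (0, 2)]
```

A faster vertex cover algorithm would speed up `has_independent_set` for free; that transfer of efficiency is exactly what a reduction buys you, even without knowing the true complexity of either problem.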
TL;DR
Michael Elkan