Fastest algorithm to decide whether an (always halting) TM accepts a general string


Given a TM $M$ that halts on all inputs, and a general string $w$, consider the most trivial algorithm (call it $A$) to decide whether $M$ accepts $w$:

$A$ simply simulates $M$ on $w$ and answers what $M$ answers.

The question here is: can this be proven to be the fastest algorithm for the job?

(I mean, it seems quite clear there could not be a faster one. Or could there?)

More formally:

Is there an algorithm $A'$ that, for every input $\langle M, w \rangle$, satisfies:

  1. If $M$ is a TM that halts on all inputs, $A'$ will return what $M$ returns on input $w$.

  2. $A'$ is faster than $A$.
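
For concreteness, here is a minimal Python sketch of the trivial algorithm $A$ described above; the tuple encoding of $M$ and the example machine are assumptions made only for this illustration, not part of the original question.

def simulate(M, w):
    """Algorithm A: simulate M on w step by step and return M's verdict."""
    delta, start, accept, reject, blank = M   # delta: (state, symbol) -> (state, symbol, move)
    tape = dict(enumerate(w))                 # sparse tape, indexed by head position
    state, head = start, 0
    while state not in (accept, reject):      # terminates because M halts on all inputs
        symbol = tape.get(head, blank)
        state, tape[head], move = delta[(state, symbol)]
        head += move
    return state == accept

# Example machine: scan the input left to right, accept on reaching the first blank.
delta = {("q0", "0"): ("q0", "0", +1),
         ("q0", "1"): ("q0", "1", +1),
         ("q0", "_"): ("qa", "_", +1)}
print(simulate((delta, "q0", "qa", "qr", "_"), "101"))   # True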










Tags: turing-machines, time-complexity






asked 2 days ago by Oren, edited 20 hours ago by xskxzr




  • There are (theoretically) infinitely many algorithms faster than that, each faster than the previous one. See, for example, this. – dkaeae, 2 days ago

  • @dkaeae Does that mean we can make any algorithm infinitely faster? – FireCubez, 2 days ago

  • @FireCubez In the technical sense of TMs, and for a particular meaning of infinity, yes. In the sense of algorithms running on real computers, no. – rlms, 2 days ago
3 Answers
Answer by David Richerby (score 3, answered 2 days ago)

    Is there an algorithm $A'$, that for every input $\langle M, w \rangle$ satisfies:

    1) If $M$ is a TM that halts on all inputs, $A'$ will return what $M$ returns with input $w$.

    2) $A'$ is faster than $A$ (in worst-case terms)?

It's not possible to be asymptotically faster by more than a log factor. By the time hierarchy theorem, for any reasonable function $f$, there are problems that can be solved in $f(n)$ steps but cannot be solved in $o(f(n)/\log n)$ steps.

Other answers point out that you can get faster by any constant factor via the linear speedup theorem, which, roughly speaking, gains a factor of $c$ by simulating $c$ steps of the Turing machine's operation at once.
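
For reference, one common textbook formulation of the deterministic time hierarchy theorem (stated here with $\log f(n)$ rather than $\log n$ in the denominator; this display is a sketch added for context, not part of the original answer) is: for every time-constructible function $f$,

$\mathrm{DTIME}\bigl(o(f(n)/\log f(n))\bigr) \subsetneq \mathrm{DTIME}\bigl(f(n)\bigr),$

i.e., some problem decidable in $O(f(n))$ time is not decidable within any time bound that is $o(f(n)/\log f(n))$.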






  • So for an input $M$ that runs in exponential time in the worst case, does this mean that the (asymptotically) fastest algorithm (or family of algorithms) improving $A$ must be exponential too, in the worst case, on the set of inputs that include this $M$ with a general string $w$? – Oren, 2 days ago

  • @Oren Exactly, yes. In particular, this is how we know that $\mathrm{EXP}\neq\mathrm{P}$: it tells us there can be no polynomial-time algorithm for an $\mathrm{EXP}$-complete problem. – David Richerby, 2 days ago

  • Hey, can you expand on the use of the time hierarchy theorem? It is still unclear to me why this specific algorithm can be improved by only a log factor: the theorem only states that such problems exist, and it doesn't follow immediately that $A$ is one of them (those that can be improved by only a log factor). – Oren, yesterday

  • As I recall, we believe that the time hierarchy theorem ought to be strict, in the sense that there are things you can do in time $f(n)$ that can't be done in time $o(f(n))$ (analogous to the space hierarchy theorem), but $o(f(n)/\log n)$ is the best anyone has managed to prove. You can't keep shaving off log factors since, if you managed to do it even twice, you'd be at roughly $f(n)/(\log n)^2$, which is less than $f(n)/\log n$. – David Richerby, yesterday

  • OK, but still: why is $A$ one of those algorithms (those that you can do in time $f(n)$ but not in time $o(f(n)/\log n)$)? – Oren, yesterday



















Answer by Draconis (score 4, answered 2 days ago, edited yesterday)

dkaeae brought up a very useful trick in the comments: the Linear Speedup Theorem. Effectively, it says:

    For any positive $k$, there's a mechanical transformation you can apply to any Turing machine which makes it run $k$ times faster.

(There's a bit more to it than that, but that's not really relevant here. Wikipedia has more details.)

So I propose the following family of algorithms (with hyperparameter $k$):

def decide(M, w):
    # use the Linear Speedup Theorem to turn M into M', which is k times faster
    # (linear_speedup and simulate are placeholder names in this sketch)
    M_prime = linear_speedup(M, k)
    # run M' on w and return the result
    return simulate(M_prime, w)

You can make this as fast as you want by increasing $k$: there's theoretically no limit on this. No matter how fast it runs, you can always make it faster by just making $k$ bigger.

This is why time complexity is always given in asymptotic terms (big-O and all that): constant factors are extremely easy to add and remove, so they don't really tell us anything useful about the algorithm itself. If you have an algorithm that runs in $n^5+C$ time, I can turn that into $\frac{1}{1{,}000{,}000}n^5+C$, but it'll still end up slower than $1{,}000{,}000\,n+C$ for large enough $n$.

P.S. You might be wondering, "what's the catch?" The answer is that the Linear Speedup construction makes a machine with more states and a more convoluted instruction set. But that doesn't matter when you're talking about time complexity.
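
As a quick numeric illustration of that last claim (a throwaway Python check added here, ignoring the additive constant $C$): the sped-up quintic only overtakes the linear bound once $n^4 > 10^{12}$, i.e. for $n$ beyond about $1000$.

for n in (10, 100, 1_000, 10_000):
    sped_up_quintic = n**5 / 1_000_000   # the "improved" n^5 algorithm
    linear = 1_000_000 * n               # the plain linear-time algorithm
    print(n, sped_up_quintic > linear)   # False, False, False, True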






Answer by dkaeae (score 0, answered 2 days ago, edited yesterday)

Of course there is.

Consider, for instance, a TM $T$ which reads its entire input (of length $n$) $10^{100n}$ times and then accepts. Then the TM $T'$ which instantly accepts any input is at least $10^{100n}$ times faster than any (step-for-step) simulation of $T$. (You may replace $10^{100n}$ with your favorite large computable number.)

Hence, the following algorithm $A'$ would do it:

1. Check whether $\langle M \rangle = \langle T \rangle$. If so, then set $\langle M \rangle$ to $\langle T' \rangle$; otherwise, leave $\langle M \rangle$ intact.

2. Do what $A$ does.

It is easy to see that $A'$ will now be $10^{100n}$ times faster than $A$ if given $\langle T, w \rangle$ as input. This qualifies as a (strict) asymptotic improvement since there are infinitely many values for $w$. $A'$ only needs $O(n)$ extra steps (in step 1) before doing what $A$ does, but $A$ takes $\Omega(n)$ time anyway (because it necessarily reads its entire input at least once), so $A'$ is asymptotically just as fast as $A$ on all other inputs.

The above construction provides an improvement for one particular TM (i.e., $T$) but can be extended to be faster than $A$ for infinitely many TMs. This can be done, for instance, by defining a family $T_k$ of TMs with a parameter $k$ such that $T_k$ reads its entire input $k^{100n}$ times and then accepts. The description of $T_k$ can be made such that it is recognizable by $A'$ in $O(n)$ time, as above (imagine, for instance, $\langle T_k \rangle$ being the exact same piece of code where $k$ is declared as a constant).
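
A minimal Python sketch of this $A'$, added for illustration only: the string encodings and the simulate stand-in below are invented for the example and are not part of the answer.

# Encodings of T (re-reads its whole input 10^(100n) times, then accepts) and of T',
# which accepts immediately. Plain strings stand in for TM descriptions here.
ENCODED_T = "T: re-read the whole input 10^(100n) times, then accept"
ENCODED_T_PRIME = "T': accept immediately"

def simulate(encoded_M, w):
    """Stand-in for algorithm A (step-by-step simulation of M on w)."""
    if encoded_M == ENCODED_T_PRIME:
        return True                       # T' accepts every input in O(1) steps
    raise NotImplementedError("general TM simulation elided in this sketch")

def a_prime(encoded_M, w):
    # Step 1: if the given machine is exactly T, swap in the equivalent but faster T'.
    if encoded_M == ENCODED_T:
        encoded_M = ENCODED_T_PRIME
    # Step 2: do what A does on the (possibly swapped) machine.
    return simulate(encoded_M, w)

print(a_prime(ENCODED_T, "some input"))   # True, without the 10^(100n) passes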






    • I get the answer from the comment above (really cool, by the way; I didn't know that), but this example is for a specific TM. I mean, if $T'$ gets as input, for example, a TM that rejects all inputs immediately, it won't work. Or am I getting this wrong? – Oren, 2 days ago

    • If you are trying to prove a statement of the form $\forall x\colon A(x)$ wrong, then you only need to provide an $x$ which falsifies $A(x)$. (Here, $x$ is $T$ and $A(x)$ is the statement that simulating $T$ step for step is the fastest possible algorithm.) – dkaeae, 2 days ago

    • I get the logic, but this one example is what I can't see working. Can you clarify the roles of $T$ and $T'$ in the algorithm, based on $M$'s role? – Oren, 2 days ago

    • $M = T$, whereas computing $T'$ (or just answering "yes") is a faster algorithm than directly simulating $T$. – dkaeae, 2 days ago

    • Though in this example it doesn't check whether $M$ accepts $w$, so it won't always be correct. I mean, $T'$ will answer correctly for $T$, and will do it faster than $M$, but for other inputs that are different from $T'$, for example an input $T''$ that rejects all inputs immediately, it will return a wrong answer. So it is an example of a fast algorithm, but one that isn't always correct. – Oren, 2 days ago











