22 Search Results for "Pass, Rafael"


Volume

OASIcs, Volume 97

3rd International Conference on Blockchain Economics, Security and Protocols (Tokenomics 2021)

Tokenomics 2021, November 18-19, 2021, New York University, USA (Virtual Conference)

Editors: Vincent Gramoli, Hanna Halaburda, and Rafael Pass

Document
The Non-Uniform Perebor Conjecture for Time-Bounded Kolmogorov Complexity Is False

Authors: Noam Mazor and Rafael Pass

Published in: LIPIcs, Volume 287, 15th Innovations in Theoretical Computer Science Conference (ITCS 2024)


Abstract
The Perebor (Russian for "brute-force search") conjectures, which date back to the 1950s and 1960s, are some of the oldest conjectures in complexity theory. The conjectures are a stronger form of the NP ≠ P conjecture (which they predate) and state that for "meta-complexity" problems, such as the Time-Bounded Kolmogorov Complexity Problem and the Minimum Circuit Size Problem, there are no better algorithms than brute-force search. In this paper, we disprove the non-uniform version of the Perebor conjecture for the Time-Bounded Kolmogorov complexity problem. We demonstrate that for every polynomial t(⋅), there exists a circuit of size 2^{4n/5+o(n)} that solves the t(⋅)-bounded Kolmogorov complexity problem on every instance. Our algorithm is black-box in the description of the Universal Turing Machine U employed in the definition of Kolmogorov Complexity, and it leverages the characterization of one-way functions through the hardness of the time-bounded Kolmogorov complexity problem of Liu and Pass (FOCS'20), and the time-space trade-off for one-way functions of Fiat and Naor (STOC'91). We additionally demonstrate that no such black-box algorithm can have circuit size smaller than 2^{n/2-o(n)}. Along the way (and of independent interest), we extend the result of Fiat and Naor and demonstrate that any efficiently computable function can be inverted (with probability 1) by a circuit of size 2^{4n/5+o(n)}; as far as we know, this yields the first formal proof that a non-trivial circuit can invert any efficient function.
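For quick reference, the two circuit-size bounds stated in the abstract can be summarized in one line (this only restates the abstract; it is not an additional claim):

\[
2^{\,n/2-o(n)} \;\le\; \text{(black-box circuit size for the } t\text{-bounded Kolmogorov complexity problem)} \;\le\; 2^{\,4n/5+o(n)} \qquad \text{for every polynomial } t(\cdot).
\]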

Cite as

Noam Mazor and Rafael Pass. The Non-Uniform Perebor Conjecture for Time-Bounded Kolmogorov Complexity Is False. In 15th Innovations in Theoretical Computer Science Conference (ITCS 2024). Leibniz International Proceedings in Informatics (LIPIcs), Volume 287, pp. 80:1-80:20, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2024)


Copy BibTex To Clipboard

@InProceedings{mazor_et_al:LIPIcs.ITCS.2024.80,
  author =	{Mazor, Noam and Pass, Rafael},
  title =	{{The Non-Uniform Perebor Conjecture for Time-Bounded Kolmogorov Complexity Is False}},
  booktitle =	{15th Innovations in Theoretical Computer Science Conference (ITCS 2024)},
  pages =	{80:1--80:20},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-309-6},
  ISSN =	{1868-8969},
  year =	{2024},
  volume =	{287},
  editor =	{Guruswami, Venkatesan},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.ITCS.2024.80},
  URN =		{urn:nbn:de:0030-drops-196088},
  doi =		{10.4230/LIPIcs.ITCS.2024.80},
  annote =	{Keywords: Kolmogorov complexity, perebor conjecture, function inversion}
}
Document
Leakage-Resilient Hardness vs Randomness

Authors: Yanyi Liu and Rafael Pass

Published in: LIPIcs, Volume 264, 38th Computational Complexity Conference (CCC 2023)


Abstract
A central open problem in complexity theory concerns the question of whether all efficient randomized algorithms can be simulated by efficient deterministic algorithms. The celebrated "hardness vs. randomness" paradigm pioneered by Blum-Micali (SIAM JoC'84), Yao (FOCS'84) and Nisan-Wigderson (JCSS'94) presents hardness assumptions under which, e.g., prBPP = prP (so-called "high-end derandomization") or prBPP ⊆ prSUBEXP (so-called "low-end derandomization"), and more generally, under which prBPP ⊆ prDTIME(𝒞) where 𝒞 is a "nice" class (closed under composition with a polynomial); but these hardness assumptions are not known to also be necessary for such derandomization. In this work, following the recent work by Chen and Tell (FOCS'21) that considers "almost-all-input" hardness of a function f (i.e., hardness of computing f on more than a finite number of inputs), we consider "almost-all-input" leakage-resilient hardness of a function f - that is, hardness of computing f(x) even given, say, √|x| bits of leakage of f(x). We show that leakage-resilient hardness characterizes derandomization of prBPP (i.e., gives a condition that is both necessary and sufficient for derandomization), both in the high-end and in the low-end setting. In more detail, we show that there exists a constant c such that for every function T, the following are equivalent: - prBPP ⊆ prDTIME(poly(T(poly(n)))); - Existence of a poly(T(poly(n)))-time computable function f :{0,1}ⁿ → {0,1}ⁿ that is almost-all-input leakage-resilient hard with respect to n^c-time probabilistic algorithms. As far as we know, this is the first assumption that characterizes derandomization in both the low-end and the high-end regime. Additionally, our characterization naturally extends to derandomization of prMA, and to average-case derandomization, by appropriately weakening the requirements on the function f. In particular, for the case of average-case (a.k.a. "effective") derandomization, we no longer require the function to be almost-all-input hard, but simply to satisfy the more standard notion of average-case leakage-resilient hardness (w.r.t. every samplable distribution), whereas for derandomization of prMA, we instead consider leakage-resilience for relations.
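Restating the abstract's equivalence in display form (no new content):

\[
\mathrm{prBPP} \subseteq \mathrm{prDTIME}\!\big(\mathrm{poly}(T(\mathrm{poly}(n)))\big)
\iff
\exists\, f:\{0,1\}^{n}\to\{0,1\}^{n} \text{ computable in time } \mathrm{poly}(T(\mathrm{poly}(n)))
\]
\[
\text{that is almost-all-input leakage-resilient hard against } n^{c}\text{-time probabilistic algorithms}
\]
\[
\text{(i.e., hard to compute } f(x) \text{ even given } \sqrt{|x|} \text{ bits of leakage of } f(x)\text{).}
\]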

Cite as

Yanyi Liu and Rafael Pass. Leakage-Resilient Hardness vs Randomness. In 38th Computational Complexity Conference (CCC 2023). Leibniz International Proceedings in Informatics (LIPIcs), Volume 264, pp. 32:1-32:20, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2023)


Copy BibTex To Clipboard

@InProceedings{liu_et_al:LIPIcs.CCC.2023.32,
  author =	{Liu, Yanyi and Pass, Rafael},
  title =	{{Leakage-Resilient Hardness vs Randomness}},
  booktitle =	{38th Computational Complexity Conference (CCC 2023)},
  pages =	{32:1--32:20},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-282-2},
  ISSN =	{1868-8969},
  year =	{2023},
  volume =	{264},
  editor =	{Ta-Shma, Amnon},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.CCC.2023.32},
  URN =		{urn:nbn:de:0030-drops-183022},
  doi =		{10.4230/LIPIcs.CCC.2023.32},
  annote =	{Keywords: Derandomization, Leakage-Resilient Hardness}
}
Document
Characterizing Derandomization Through Hardness of Levin-Kolmogorov Complexity

Authors: Yanyi Liu and Rafael Pass

Published in: LIPIcs, Volume 234, 37th Computational Complexity Conference (CCC 2022)


Abstract
A central open problem in complexity theory concerns the question of whether all efficient randomized algorithms can be simulated by efficient deterministic algorithms. We consider this problem in the context of promise problems (i.e., the prBPP vs. prP problem) and show that for all sufficiently large constants c, the following are equivalent: - prBPP = prP. - For every BPTIME(n^c) algorithm M, and every sufficiently long z ∈ {0,1}ⁿ, there exists some x ∈ {0,1}ⁿ such that M fails to decide whether Kt(x∣z) is "very large" (≥ n-1) or "very small" (≤ O(log n)), where Kt(x∣z) denotes the Levin-Kolmogorov complexity of x conditioned on z. As far as we are aware, this yields the first full characterization of when prBPP = prP through the hardness of some class of problems. Previous hardness assumptions used for derandomization only provide a one-sided implication.
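The characterization can be written in display form (again, only a restatement of the abstract):

\[
\mathrm{prBPP} = \mathrm{prP}
\iff
\forall M \in \mathrm{BPTIME}(n^{c})\ \ \forall \text{ sufficiently long } z\in\{0,1\}^{n}\ \ \exists x\in\{0,1\}^{n}:
\]
\[
M \text{ fails to decide whether } \mathit{Kt}(x \mid z) \ge n-1 \text{ or } \mathit{Kt}(x \mid z) \le O(\log n).
\]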

Cite as

Yanyi Liu and Rafael Pass. Characterizing Derandomization Through Hardness of Levin-Kolmogorov Complexity. In 37th Computational Complexity Conference (CCC 2022). Leibniz International Proceedings in Informatics (LIPIcs), Volume 234, pp. 35:1-35:17, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2022)


Copy BibTex To Clipboard

@InProceedings{liu_et_al:LIPIcs.CCC.2022.35,
  author =	{Liu, Yanyi and Pass, Rafael},
  title =	{{Characterizing Derandomization Through Hardness of Levin-Kolmogorov Complexity}},
  booktitle =	{37th Computational Complexity Conference (CCC 2022)},
  pages =	{35:1--35:17},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-241-9},
  ISSN =	{1868-8969},
  year =	{2022},
  volume =	{234},
  editor =	{Lovett, Shachar},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.CCC.2022.35},
  URN =		{urn:nbn:de:0030-drops-165975},
  doi =		{10.4230/LIPIcs.CCC.2022.35},
  annote =	{Keywords: Derandomization, Kolmogorov Complexity, Hitting Set Generators}
}
Document
On One-Way Functions from NP-Complete Problems

Authors: Yanyi Liu and Rafael Pass

Published in: LIPIcs, Volume 234, 37th Computational Complexity Conference (CCC 2022)


Abstract
We present the first natural NP-complete problem whose average-case hardness w.r.t. the uniform distribution over instances is equivalent to the existence of one-way functions (OWFs). The problem, which originated in the 1960s, is the Conditional Time-Bounded Kolmogorov Complexity Problem: let K^t(x∣z) be the length of the shortest "program" that, given the "auxiliary input" z, outputs the string x within time t(|x|), and let McK^tP[ζ] be the set of strings (x,z,k) where |z| = ζ(|x|), |k| = log |x| and K^t(x∣z) < k, where, for our purposes, a "program" is defined as a RAM machine. Our main result shows that for every polynomial t(n) ≥ n², there exists some polynomial ζ such that McK^tP[ζ] is NP-complete. We additionally extend the result of Liu-Pass (FOCS'20) to show that for every polynomial t(n) ≥ 1.1n, and every polynomial ζ(⋅), mild average-case hardness of McK^tP[ζ] is equivalent to the existence of OWFs. Taken together, these results provide the following crisp characterization of what is required to base OWFs on NP ⊈ BPP: There exist concrete polynomials t, ζ such that "Basing OWFs on NP ⊈ BPP" is equivalent to providing a "worst-case to (mild) average-case reduction for McK^tP[ζ]". In other words, the "holy grail" of Cryptography (i.e., basing OWFs on NP ⊈ BPP) is equivalent to a basic question in algorithmic information theory. As an independent contribution, we show that our NP-completeness result can be used to shed new light on the feasibility of the polynomial-time bounded symmetry of information assertion (Kolmogorov'68).
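In symbols, the problem defined in the abstract is:

\[
K^{t}(x \mid z) \;=\; \min\bigl\{\, |\Pi| \;:\; \Pi(z) \text{ outputs } x \text{ within } t(|x|) \text{ steps} \,\bigr\},
\]
\[
\mathrm{McK}^{t}\mathrm{P}[\zeta] \;=\; \bigl\{\, (x,z,k) \;:\; |z| = \zeta(|x|),\ |k| = \log|x|,\ K^{t}(x \mid z) < k \,\bigr\},
\]
where, per the abstract, a "program" Π is a RAM machine.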

Cite as

Yanyi Liu and Rafael Pass. On One-Way Functions from NP-Complete Problems. In 37th Computational Complexity Conference (CCC 2022). Leibniz International Proceedings in Informatics (LIPIcs), Volume 234, pp. 36:1-36:24, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2022)


Copy BibTex To Clipboard

@InProceedings{liu_et_al:LIPIcs.CCC.2022.36,
  author =	{Liu, Yanyi and Pass, Rafael},
  title =	{{On One-Way Functions from NP-Complete Problems}},
  booktitle =	{37th Computational Complexity Conference (CCC 2022)},
  pages =	{36:1--36:24},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-241-9},
  ISSN =	{1868-8969},
  year =	{2022},
  volume =	{234},
  editor =	{Lovett, Shachar},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.CCC.2022.36},
  URN =		{urn:nbn:de:0030-drops-165981},
  doi =		{10.4230/LIPIcs.CCC.2022.36},
  annote =	{Keywords: One-way Functions, NP-Completeness, Kolmogorov Complexity}
}
Document
Complete Volume
OASIcs, Volume 97, Tokenomics 2021, Complete Volume

Authors: Vincent Gramoli, Hanna Halaburda, and Rafael Pass

Published in: OASIcs, Volume 97, 3rd International Conference on Blockchain Economics, Security and Protocols (Tokenomics 2021)


Abstract
OASIcs, Volume 97, Tokenomics 2021, Complete Volume

Cite as

3rd International Conference on Blockchain Economics, Security and Protocols (Tokenomics 2021). Open Access Series in Informatics (OASIcs), Volume 97, pp. 1-124, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2022)


Copy BibTex To Clipboard

@Proceedings{gramoli_et_al:OASIcs.Tokenomics.2021,
  title =	{{OASIcs, Volume 97, Tokenomics 2021, Complete Volume}},
  booktitle =	{3rd International Conference on Blockchain Economics, Security and Protocols (Tokenomics 2021)},
  pages =	{1--124},
  series =	{Open Access Series in Informatics (OASIcs)},
  ISBN =	{978-3-95977-220-4},
  ISSN =	{2190-6807},
  year =	{2022},
  volume =	{97},
  editor =	{Gramoli, Vincent and Halaburda, Hanna and Pass, Rafael},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/OASIcs.Tokenomics.2021},
  URN =		{urn:nbn:de:0030-drops-158965},
  doi =		{10.4230/OASIcs.Tokenomics.2021},
  annote =	{Keywords: OASIcs, Volume 97, Tokenomics 2021, Complete Volume}
}
Document
Front Matter
Front Matter, Table of Contents, Preface, Conference Organization

Authors: Vincent Gramoli, Hanna Halaburda, and Rafael Pass

Published in: OASIcs, Volume 97, 3rd International Conference on Blockchain Economics, Security and Protocols (Tokenomics 2021)


Abstract
Front Matter, Table of Contents, Preface, Conference Organization

Cite as

3rd International Conference on Blockchain Economics, Security and Protocols (Tokenomics 2021). Open Access Series in Informatics (OASIcs), Volume 97, pp. 0:i-0:x, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2022)


Copy BibTex To Clipboard

@InProceedings{gramoli_et_al:OASIcs.Tokenomics.2021.0,
  author =	{Gramoli, Vincent and Halaburda, Hanna and Pass, Rafael},
  title =	{{Front Matter, Table of Contents, Preface, Conference Organization}},
  booktitle =	{3rd International Conference on Blockchain Economics, Security and Protocols (Tokenomics 2021)},
  pages =	{0:i--0:x},
  series =	{Open Access Series in Informatics (OASIcs)},
  ISBN =	{978-3-95977-220-4},
  ISSN =	{2190-6807},
  year =	{2022},
  volume =	{97},
  editor =	{Gramoli, Vincent and Halaburda, Hanna and Pass, Rafael},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/OASIcs.Tokenomics.2021.0},
  URN =		{urn:nbn:de:0030-drops-158975},
  doi =		{10.4230/OASIcs.Tokenomics.2021.0},
  annote =	{Keywords: Front Matter, Table of Contents, Preface, Conference Organization}
}
Document
Invited Talk
Distributed Computing Meets Game Theory: Fault Tolerance and Implementation with Cheap Talk (Invited Talk)

Authors: Joseph Y. Halpern

Published in: OASIcs, Volume 97, 3rd International Conference on Blockchain Economics, Security and Protocols (Tokenomics 2021)


Abstract
Traditionally, work in distributed computing has divided the agents into "good guys" and "bad guys". The good guys follow the protocol; the bad guys do everything in their power to make sure it does not work. By way of contrast, game theory has focused on "rational" agents, who try to maximize their utilities. Here I try to combine these viewpoints. Specifically, following the work of Abraham et al. [I. Abraham et al., 2006], I consider (k,t)-robust protocols/strategies, which tolerate coalitions of rational players of size up to k and up to t malicious players. I focus in particular on the problem that economists have called implementing a mediator. That is, can the players in the system, just talking among themselves (using what economists call "cheap talk"), simulate the effects of the mediator (see, e.g., [I. Barany, 1992; E. Ben-Porath, 2003; Forges, 1990; D. Gerardi, 2004; Y. Heller, 2005; A. Urbano and J. E. Vila, 2002; A. Urbano and J. E. Vila, 2004])? In computer science, this essentially amounts to multiparty computation [O. Goldreich et al., 1987; A. Shamir et al., 1981; A. Yao, 1982]. Ideas from cryptography and distributed computing allow us to prove results on how many agents are required to implement a (k,t)-robust mediator just using cheap talk. These results subsume (and, in some cases, correct) results from the game theory literature. The results of Abraham et al. [I. Abraham et al., 2006] were proved for what are called synchronous systems in the distributed computing community; this is also the case for all the results in the economics literature cited above. In synchronous systems, communication proceeds in atomic rounds, and all messages sent during round r are received by round r + 1. But many systems in the real world are asynchronous. In an asynchronous setting, there are no rounds; messages sent by the players may take arbitrarily long to get to their recipients. Markets and the internet are best viewed as asynchronous. Blockchain implementations assume partial synchrony, where there is an upper bound on how long messages take to arrive. The partially synchronous setting already shows some of the difficulty of moving away from synchrony: an agent i can wait to take its action until it receives a message from j (on which its action can depend). This cannot happen in a synchronous setting. Abraham, Dolev, Geffner, and Halpern [I. Abraham et al., 2019] extend the results on implementing mediators to the asynchronous setting.

Cite as

Joseph Y. Halpern. Distributed Computing Meets Game Theory: Fault Tolerance and Implementation with Cheap Talk (Invited Talk). In 3rd International Conference on Blockchain Economics, Security and Protocols (Tokenomics 2021). Open Access Series in Informatics (OASIcs), Volume 97, pp. 1:1-1:2, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2022)


Copy BibTex To Clipboard

@InProceedings{halpern:OASIcs.Tokenomics.2021.1,
  author =	{Halpern, Joseph Y.},
  title =	{{Distributed Computing Meets Game Theory: Fault Tolerance and Implementation with Cheap Talk}},
  booktitle =	{3rd International Conference on Blockchain Economics, Security and Protocols (Tokenomics 2021)},
  pages =	{1:1--1:2},
  series =	{Open Access Series in Informatics (OASIcs)},
  ISBN =	{978-3-95977-220-4},
  ISSN =	{2190-6807},
  year =	{2022},
  volume =	{97},
  editor =	{Gramoli, Vincent and Halaburda, Hanna and Pass, Rafael},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/OASIcs.Tokenomics.2021.1},
  URN =		{urn:nbn:de:0030-drops-158981},
  doi =		{10.4230/OASIcs.Tokenomics.2021.1},
  annote =	{Keywords: robust equilibrium, implementing mediators, asynchronous systems}
}
Document
General Congestion Attack on HTLC-Based Payment Channel Networks

Authors: Zhichun Lu, Runchao Han, and Jiangshan Yu

Published in: OASIcs, Volume 97, 3rd International Conference on Blockchain Economics, Security and Protocols (Tokenomics 2021)


Abstract
Payment Channel Networks (PCNs) have been a promising approach to scale blockchains. However, PCNs have limited liquidity: large-amount or multi-hop payments may fail. The major threat to PCN liquidity is payment griefing, where the adversary, acting as the payee, keeps withholding the payment, so that coins involved in the payment cannot be used for routing other payments before the payment expires. Payment griefing gives adversaries a chance to launch the congestion attack, where the adversary griefs a large number of payments and paralyses the entire PCN. Understanding congestion attacks, including their strategies and impact, is crucial for designing PCNs with better liquidity guarantees. However, existing research has focused only on specific attack strategies and specific aspects of their impact on PCNs. We fill this gap by studying the general congestion attack. Compared to existing attack strategies, in our framework each step serves an orthogonal purpose and is customisable, allowing the adversary to focus on different aspects of the liquidity. To evaluate the attack's impact, we propose a generic method of quantifying PCNs' liquidity and the effectiveness of congestion attacks. We evaluate our general congestion attacks on Bitcoin's Lightning Network, and show that with direct channels to the 1.5% richest nodes, and ∼ 0.0096 BTC of cost, the adversary can launch a congestion attack that locks 47% (∼280 BTC) of the coins in the network; reduces the success rate of payments by 16.0%∼60.0%; increases the fee of payments by 4.5%∼16.0%; increases the average attempts of payments by 42.0%∼115.3%; and increases the number of bankruptcy nodes (i.e., nodes with insufficient balance for making normal-size payments) by 26.6%∼109.4%, where the amounts of payments range from 0.001 to 0.019 BTC.
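As a rough, hypothetical illustration of the quantity being measured (this is not the paper's method or code): in an HTLC-based PCN, a withheld multi-hop payment keeps roughly its amount locked in every channel along its route until the HTLCs expire, so the coins locked by a set of griefed payments can be estimated as below. The payment mix in the example is made up.

# Hypothetical sketch: estimate coins locked by griefed HTLC payments,
# assuming each withheld payment of `amount` BTC stays locked in each of
# its `hops` channels until the HTLCs expire.
def locked_liquidity_btc(griefed_payments):
    """griefed_payments: iterable of (amount_btc, hops) pairs."""
    return sum(amount * hops for amount, hops in griefed_payments)

if __name__ == "__main__":
    # Made-up payment mix, purely to show the arithmetic.
    payments = [(0.005, 10)] * 500 + [(0.015, 20)] * 200
    print(locked_liquidity_btc(payments))  # 0.005*10*500 + 0.015*20*200 = 85.0 BTC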

Cite as

Zhichun Lu, Runchao Han, and Jiangshan Yu. General Congestion Attack on HTLC-Based Payment Channel Networks. In 3rd International Conference on Blockchain Economics, Security and Protocols (Tokenomics 2021). Open Access Series in Informatics (OASIcs), Volume 97, pp. 2:1-2:15, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2022)


Copy BibTex To Clipboard

@InProceedings{lu_et_al:OASIcs.Tokenomics.2021.2,
  author =	{Lu, Zhichun and Han, Runchao and Yu, Jiangshan},
  title =	{{General Congestion Attack on HTLC-Based Payment Channel Networks}},
  booktitle =	{3rd International Conference on Blockchain Economics, Security and Protocols (Tokenomics 2021)},
  pages =	{2:1--2:15},
  series =	{Open Access Series in Informatics (OASIcs)},
  ISBN =	{978-3-95977-220-4},
  ISSN =	{2190-6807},
  year =	{2022},
  volume =	{97},
  editor =	{Gramoli, Vincent and Halaburda, Hanna and Pass, Rafael},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/OASIcs.Tokenomics.2021.2},
  URN =		{urn:nbn:de:0030-drops-158990},
  doi =		{10.4230/OASIcs.Tokenomics.2021.2},
  annote =	{Keywords: Blockchain, PCN, Congestion}
}
Document
Tuning PoW with Hybrid Expenditure

Authors: Itay Tsabary, Alexander Spiegelman, and Ittay Eyal

Published in: OASIcs, Volume 97, 3rd International Conference on Blockchain Economics, Security and Protocols (Tokenomics 2021)


Abstract
Proof of Work (PoW) is a Sybil-deterrence security mechanism. It introduces an external cost to system participation by requiring computational effort to perform actions. However, since its inception, a central challenge has been to tune this cost. Initial designs for deterring spam email and DoS attacks applied overhead equally to honest participants and attackers. Requiring too little effort does not deter attacks, whereas too much encumbers honest participation; this might be the reason the mechanism was never widely adopted. Nakamoto overcame this trade-off in Bitcoin by distinguishing desired from malicious behavior and introducing internal rewards for the former. This mechanism gained popularity in securing permissionless cryptocurrencies, using virtual internally-minted tokens for rewards. However, in existing blockchain protocols the internal rewards directly compensate users for (almost) the same value of external expenses. Thus, as the token value soars, so does the PoW expenditure. Bitcoin PoW, for example, already expends as much electricity as Colombia or Switzerland. This amount of resource-guzzling is unsustainable, and hinders even wider adoption of these systems. As such, a prominent alternative named Proof of Stake (PoS) replaces the expenditure requirement with token possession. However, PoS is shunned by many cryptocurrency projects, as it is only secure under qualitatively different assumptions, and the resultant systems are not permissionless. In this work we present Hybrid Expenditure Blockchain (HEB), a novel PoW mechanism. HEB is a generalization of Nakamoto's protocol that enables tuning the external expenditure by introducing a complementary internal-expenditure mechanism. Thus, for the first time, HEB decouples external expenditure from the reward value. We show a practical parameter choice by which HEB requires significantly less external consumption compared to Nakamoto's protocol, its resilience against rational attackers is similar, and it retains the decentralized and permissionless nature of the system. Taking the Bitcoin ecosystem as an example, HEB cuts the electricity consumption by half.

Cite as

Itay Tsabary, Alexander Spiegelman, and Ittay Eyal. Tuning PoW with Hybrid Expenditure. In 3rd International Conference on Blockchain Economics, Security and Protocols (Tokenomics 2021). Open Access Series in Informatics (OASIcs), Volume 97, pp. 3:1-3:17, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2022)


Copy BibTex To Clipboard

@InProceedings{tsabary_et_al:OASIcs.Tokenomics.2021.3,
  author =	{Tsabary, Itay and Spiegelman, Alexander and Eyal, Ittay},
  title =	{{Tuning PoW with Hybrid Expenditure}},
  booktitle =	{3rd International Conference on Blockchain Economics, Security and Protocols (Tokenomics 2021)},
  pages =	{3:1--3:17},
  series =	{Open Access Series in Informatics (OASIcs)},
  ISBN =	{978-3-95977-220-4},
  ISSN =	{2190-6807},
  year =	{2022},
  volume =	{97},
  editor =	{Gramoli, Vincent and Halaburda, Hanna and Pass, Rafael},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/OASIcs.Tokenomics.2021.3},
  URN =		{urn:nbn:de:0030-drops-159008},
  doi =		{10.4230/OASIcs.Tokenomics.2021.3},
  annote =	{Keywords: Blockchain, Proof of work, Cryptocurrency, Environmental impact}
}
Document
On Cryptocurrency Wallet Design

Authors: Ittay Eyal

Published in: OASIcs, Volume 97, 3rd International Conference on Blockchain Economics, Security and Protocols (Tokenomics 2021)


Abstract
The security of cryptocurrency and decentralized blockchain-maintained assets relies on their owners safeguarding secrets, typically cryptographic keys. This applies equally to individuals keeping daily-spending amounts and to large asset management companies. Loss of keys and attackers gaining control of keys have resulted in numerous losses of funds. The security of individual keys has been widely studied, with practical solutions available, from mnemonic phrases to dedicated hardware. There are also techniques for securing funds by requiring combinations of multiple keys. However, to the best of our knowledge, a crucial question was never addressed: How is wallet security affected by the number of keys, their types, and how they are combined? This is the focus of this work. We present a model where each key has certain probabilities for being safe, lost, leaked, or stolen (available only to an attacker). The number of possible wallets for a given number of keys is the Dedekind number, prohibiting an exhaustive search with many keys. Nonetheless, we bound optimal-wallet failure probabilities with an evolutionary algorithm. We evaluate the security (complement of failure probability) of wallets based on the number and types of keys used. Our analysis covers a wide range of settings and reveals several surprises. The general trend is that the failure probability drops exponentially with the number of keys, but it depends strongly on that number's parity. In many cases, but not always, heterogeneous keys (not all with the same fault probabilities) allow for better wallets than homogeneous keys. Nonetheless, in the case of 3 keys, the common practice of requiring any pair is optimal in many settings. Our formulation of the problem and initial results reveal several open questions, from user studies of key fault probabilities to finding optimal wallets with very large numbers of keys. But they also have an immediate practical outcome, informing cryptocurrency users on optimal wallet design.
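A minimal sketch of the kind of calculation the abstract describes, under simplifying assumptions that are ours rather than the paper's: keys fail independently, the wallet is a plain k-of-n threshold policy (the paper considers arbitrary monotone policies, hence the Dedekind number), and the wallet fails if the attacker can assemble k keys or the owner cannot. The per-key probabilities are hypothetical.

from itertools import product

# Hypothetical per-key probabilities: safe (owner only), lost (nobody),
# leaked (owner and attacker), stolen (attacker only).
P = {"safe": 0.90, "lost": 0.04, "leaked": 0.03, "stolen": 0.03}

def failure_probability(n_keys, threshold):
    """Failure probability of a threshold wallet: it fails if the attacker
    can assemble `threshold` keys, or the owner cannot."""
    fail = 0.0
    for states in product(P, repeat=n_keys):
        prob = 1.0
        for s in states:
            prob *= P[s]
        owner = sum(s in ("safe", "leaked") for s in states)
        attacker = sum(s in ("leaked", "stolen") for s in states)
        if attacker >= threshold or owner < threshold:
            fail += prob
    return fail

print(failure_probability(1, 1))  # single-key wallet
print(failure_probability(3, 2))  # the common 2-of-3 wallet

Exhaustive enumeration like this is only feasible for small numbers of keys and simple policies; as the abstract notes, searching over all wallet structures quickly becomes infeasible, which motivates the authors' evolutionary algorithm.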

Cite as

Ittay Eyal. On Cryptocurrency Wallet Design. In 3rd International Conference on Blockchain Economics, Security and Protocols (Tokenomics 2021). Open Access Series in Informatics (OASIcs), Volume 97, pp. 4:1-4:16, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2022)


Copy BibTex To Clipboard

@InProceedings{eyal:OASIcs.Tokenomics.2021.4,
  author =	{Eyal, Ittay},
  title =	{{On Cryptocurrency Wallet Design}},
  booktitle =	{3rd International Conference on Blockchain Economics, Security and Protocols (Tokenomics 2021)},
  pages =	{4:1--4:16},
  series =	{Open Access Series in Informatics (OASIcs)},
  ISBN =	{978-3-95977-220-4},
  ISSN =	{2190-6807},
  year =	{2022},
  volume =	{97},
  editor =	{Gramoli, Vincent and Halaburda, Hanna and Pass, Rafael},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/OASIcs.Tokenomics.2021.4},
  URN =		{urn:nbn:de:0030-drops-159012},
  doi =		{10.4230/OASIcs.Tokenomics.2021.4},
  annote =	{Keywords: cryptocurrency, wallet, key-management, authentication}
}
Document
Secure Computation with Non-Equivalent Penalties in Constant Rounds

Authors: Takeshi Nakai and Kazumasa Shinagawa

Published in: OASIcs, Volume 97, 3rd International Conference on Blockchain Economics, Security and Protocols (Tokenomics 2021)


Abstract
It is known that Bitcoin makes it possible to achieve fairness in secure computation by imposing a monetary penalty on adversarial parties. This functionality is called secure computation with penalties. Bentov and Kumaresan (Crypto 2014) showed that it can be realized with O(n) rounds and O(n) broadcasts for any function, where n is the number of parties. Kumaresan and Bentov (CCS 2014) posed an open question: "Is it possible to design secure computation with penalties that needs only O(1) rounds and O(n) broadcasts?" In this work, we introduce secure computation with non-equivalent penalties, and design a protocol achieving this functionality with only O(1) rounds and O(n) broadcasts. The new functionality is the same as secure computation with penalties except that every honest party receives more than a predetermined amount of compensation, whereas the original functionality requires that every honest party receive the same amount of compensation. In particular, the two coincide if all parties behave honestly. Thus, our result gives a partial answer to the open problem with a slight and natural modification of the functionality.

Cite as

Takeshi Nakai and Kazumasa Shinagawa. Secure Computation with Non-Equivalent Penalties in Constant Rounds. In 3rd International Conference on Blockchain Economics, Security and Protocols (Tokenomics 2021). Open Access Series in Informatics (OASIcs), Volume 97, pp. 5:1-5:16, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2022)


Copy BibTex To Clipboard

@InProceedings{nakai_et_al:OASIcs.Tokenomics.2021.5,
  author =	{Nakai, Takeshi and Shinagawa, Kazumasa},
  title =	{{Secure Computation with Non-Equivalent Penalties in Constant Rounds}},
  booktitle =	{3rd International Conference on Blockchain Economics, Security and Protocols (Tokenomics 2021)},
  pages =	{5:1--5:16},
  series =	{Open Access Series in Informatics (OASIcs)},
  ISBN =	{978-3-95977-220-4},
  ISSN =	{2190-6807},
  year =	{2022},
  volume =	{97},
  editor =	{Gramoli, Vincent and Halaburda, Hanna and Pass, Rafael},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/OASIcs.Tokenomics.2021.5},
  URN =		{urn:nbn:de:0030-drops-159026},
  doi =		{10.4230/OASIcs.Tokenomics.2021.5},
  annote =	{Keywords: Secure computation, Fairness, Bitcoin}
}
Document
Invited Talk
Dynamic Posted-Price Mechanisms for the Blockchain Transaction Fee Market (Invited Talk)

Authors: Matheus V. X. Ferreira, Daniel J. Moroz, David C. Parkes, and Mitchell Stern

Published in: OASIcs, Volume 97, 3rd International Conference on Blockchain Economics, Security and Protocols (Tokenomics 2021)


Abstract
In recent years, prominent blockchain systems such as Bitcoin and Ethereum have experienced explosive growth in transaction volume, leading to frequent surges in demand for limited block space and causing transaction fees to fluctuate by orders of magnitude. Existing systems sell space using first-price auctions; however, users find it difficult to estimate how much they need to bid in order to get their transactions accepted onto the chain. If they bid too low, their transactions can have long confirmation times. If they bid too high, they pay larger fees than necessary. In light of these issues, new transaction fee mechanisms have been proposed, most notably EIP-1559, aiming to provide better usability. EIP-1559 is a history-dependent mechanism that relies on block utilization to adjust a base fee. We propose an alternative design - a dynamic posted-price mechanism - which uses not only block utilization but also observable bids from past blocks to compute a posted price for subsequent blocks. We show its potential to reduce price volatility by providing examples for which the prices of EIP-1559 are unstable while the prices of the proposed mechanism are stable. More generally, whenever the demand for the blockchain stabilizes, we ask if our mechanism is able to converge to a stable state. Our main result provides sufficient conditions in a probabilistic setting for which the proposed mechanism is approximately welfare optimal and the prices are stable. Our main technical contribution towards establishing stability is an iterative algorithm that, given oracle access to a Lipschitz continuous and strictly concave function f, converges to a fixed point of f.
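The abstract's last sentence mentions an iterative algorithm that, given oracle access to a Lipschitz continuous and strictly concave function f, converges to a fixed point of f. The sketch below is a generic damped fixed-point iteration shown only for intuition; it is not claimed to be the authors' algorithm or to carry their guarantees, and the toy oracle is made up.

import math

def fixed_point(f, x0, step=0.5, tol=1e-9, max_iter=10_000):
    """Damped fixed-point iteration x <- (1 - step)*x + step*f(x)."""
    x = x0
    for _ in range(max_iter):
        x_next = (1 - step) * x + step * f(x)
        if abs(x_next - x) < tol:
            return x_next
        x = x_next
    return x

# Toy oracle: f(p) = sqrt(p) + 1 is Lipschitz and strictly concave for p >= 1;
# its fixed point is p* = (3 + sqrt(5)) / 2 ≈ 2.618.
print(fixed_point(lambda p: math.sqrt(p) + 1, x0=1.0))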

Cite as

Matheus V. X. Ferreira, Daniel J. Moroz, David C. Parkes, and Mitchell Stern. Dynamic Posted-Price Mechanisms for the Blockchain Transaction Fee Market (Invited Talk). In 3rd International Conference on Blockchain Economics, Security and Protocols (Tokenomics 2021). Open Access Series in Informatics (OASIcs), Volume 97, p. 6:1, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2022)


Copy BibTex To Clipboard

@InProceedings{ferreira_et_al:OASIcs.Tokenomics.2021.6,
  author =	{Ferreira, Matheus V. X. and Moroz, Daniel J. and Parkes, David C. and Stern, Mitchell},
  title =	{{Dynamic Posted-Price Mechanisms for the Blockchain Transaction Fee Market}},
  booktitle =	{3rd International Conference on Blockchain Economics, Security and Protocols (Tokenomics 2021)},
  pages =	{6:1--6:1},
  series =	{Open Access Series in Informatics (OASIcs)},
  ISBN =	{978-3-95977-220-4},
  ISSN =	{2190-6807},
  year =	{2022},
  volume =	{97},
  editor =	{Gramoli, Vincent and Halaburda, Hanna and Pass, Rafael},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/OASIcs.Tokenomics.2021.6},
  URN =		{urn:nbn:de:0030-drops-159039},
  doi =		{10.4230/OASIcs.Tokenomics.2021.6},
  annote =	{Keywords: Blockchain, Posted-price mechanism, Credible, Incentive compatibility, Transaction fee market, first-price auction, EIP-1559}
}
Document
Extended Abstract
Best Before? Expiring Central Bank Digital Currency and Loss Recovery (Extended Abstract)

Authors: Charles M. Kahn, Maarten R.C. van Oordt, and Yu Zhu

Published in: OASIcs, Volume 97, 3rd International Conference on Blockchain Economics, Security and Protocols (Tokenomics 2021)


Abstract
An important feature of physical cash payments is resilience, due to their independence of power outages or network coverage. Many central banks are exploring issuing digital cash substitutes with similar offline payment functionality. Such substitutes could incorporate novel features making them more desirable than physical cash. This paper considers introducing an expiry date for offline digital currency balances to automate personal loss recovery. We show this functionality could increase consumer demand for digital cash, with the time to expiration playing a key role. The optimal time to expiration balances a short wait before lost money is reimbursed against leaving enough time for agents to deposit received offline balances. Setting the time to expiration slightly too long has a minor impact on welfare, but setting it too short dramatically reduces welfare. If the offline device provides information about past transactions to the central bank, the accuracy of loss recovery can be improved, but welfare may decrease.

Cite as

Charles M. Kahn, Maarten R.C. van Oordt, and Yu Zhu. Best Before? Expiring Central Bank Digital Currency and Loss Recovery (Extended Abstract). In 3rd International Conference on Blockchain Economics, Security and Protocols (Tokenomics 2021). Open Access Series in Informatics (OASIcs), Volume 97, p. 7:1, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2022)


Copy BibTex To Clipboard

@InProceedings{kahn_et_al:OASIcs.Tokenomics.2021.7,
  author =	{Kahn, Charles M. and van Oordt, Maarten R.C. and Zhu, Yu},
  title =	{{Best Before? Expiring Central Bank Digital Currency and Loss Recovery}},
  booktitle =	{3rd International Conference on Blockchain Economics, Security and Protocols (Tokenomics 2021)},
  pages =	{7:1--7:1},
  series =	{Open Access Series in Informatics (OASIcs)},
  ISBN =	{978-3-95977-220-4},
  ISSN =	{2190-6807},
  year =	{2022},
  volume =	{97},
  editor =	{Gramoli, Vincent and Halaburda, Hanna and Pass, Rafael},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/OASIcs.Tokenomics.2021.7},
  URN =		{urn:nbn:de:0030-drops-159040},
  doi =		{10.4230/OASIcs.Tokenomics.2021.7},
  annote =	{Keywords: Central Bank Digital Currency, Design, Offline Payments, Operational Resilience, Financial Inclusion}
}
Document
Optimal Design of Tokenized Markets

Authors: Michael Junho Lee, Antoine Martin, and Robert M. Townsend

Published in: OASIcs, Volume 97, 3rd International Conference on Blockchain Economics, Security and Protocols (Tokenomics 2021)


Abstract
Trades in today’s financial system are inherently subject to settlement uncertainty. This paper explores tokenization as a potential technological solution. A token system, by enabling programmability of assets, can be designed to eradicate settlement uncertainty. We study the allocations achieved in a decentralized market with either the legacy settlement system or a token system. Tokenization can improve efficiency in markets subject to a limited commitment problem. However, it also materially alters the information environment, which in turn aggravates a hold-up problem. This limits potential gains from resolving settlement uncertainty, particularly for markets that depend on intermediaries.

Cite as

Michael Junho Lee, Antoine Martin, and Robert M. Townsend. Optimal Design of Tokenized Markets. In 3rd International Conference on Blockchain Economics, Security and Protocols (Tokenomics 2021). Open Access Series in Informatics (OASIcs), Volume 97, p. 8:1, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2022)


Copy BibTex To Clipboard

@InProceedings{lee_et_al:OASIcs.Tokenomics.2021.8,
  author =	{Lee, Michael Junho and Martin, Antoine and Townsend, Robert M.},
  title =	{{Optimal Design of Tokenized Markets}},
  booktitle =	{3rd International Conference on Blockchain Economics, Security and Protocols (Tokenomics 2021)},
  pages =	{8:1--8:1},
  series =	{Open Access Series in Informatics (OASIcs)},
  ISBN =	{978-3-95977-220-4},
  ISSN =	{2190-6807},
  year =	{2022},
  volume =	{97},
  editor =	{Gramoli, Vincent and Halaburda, Hanna and Pass, Rafael},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/OASIcs.Tokenomics.2021.8},
  URN =		{urn:nbn:de:0030-drops-159051},
  doi =		{10.4230/OASIcs.Tokenomics.2021.8},
  annote =	{Keywords: Tokenization, programmability, settlement uncertainty, asymmetric information}
}
  • Refine by Author
  • 7 Pass, Rafael
  • 3 Liu, Yanyi
  • 2 Eyal, Ittay
  • 2 Gramoli, Vincent
  • 2 Halaburda, Hanna
  • Show More...

  • Refine by Classification
  • 4 Security and privacy → Cryptography
  • 2 Mathematics of computing → Mathematical analysis
  • 2 Security and privacy → Distributed systems security
  • 2 Theory of computation → Computational complexity and cryptography
  • 2 Theory of computation → Distributed algorithms
  • Show More...

  • Refine by Keyword
  • 5 Blockchain
  • 4 Cryptocurrency
  • 2 Bitcoin
  • 2 Derandomization
  • 2 Kolmogorov Complexity
  • Show More...

  • Refine by Type
  • 21 document
  • 1 volume

  • Refine by Publication Year
  • 18 2022
  • 1 2017
  • 1 2021
  • 1 2023
  • 1 2024
