
Comparison between Avalanche, Cosmos and Polkadot

Reposting after it was mistakenly removed by the mods (since resolved - thanks)
A frequent question I see asked is how Cosmos, Polkadot and Avalanche compare. Whilst there are similarities, there are also a lot of differences. This article is not intended to be an extensive in-depth list, but rather an overview based on some of the criteria that I feel are most important.
For better formatting see https://medium.com/ava-hub/comparison-between-avalanche-cosmos-and-polkadot-a2a98f46c03b
https://preview.redd.it/e8s7dj3ivpq51.png?width=428&format=png&auto=webp&s=5d0463462702637118c7527ebf96e91f4a80b290

Overview

Cosmos

Cosmos is a heterogeneous network of many independent parallel blockchains, each powered by classical BFT consensus algorithms like Tendermint. Developers can easily build custom application specific blockchains, called Zones, through the Cosmos SDK framework. These Zones connect to Hubs, which are specifically designed to connect zones together.
The vision of Cosmos is to have thousands of Zones and Hubs that are interoperable through the Inter-Blockchain Communication Protocol (IBC). Cosmos can also connect to other systems through peg zones, which are specially designed zones, each custom made to interact with another ecosystem such as Ethereum or Bitcoin. Cosmos does not use sharding; each Zone and Hub is sovereign with its own validator set.
For a more in-depth look at Cosmos, and for more context on points made in this article, please see my three-part series — Part One, Part Two, Part Three
(There’s a quick video overview of Cosmos on YouTube - https://youtu.be/Eb8xkDi_PUg - which is also embedded in the medium article - https://medium.com/ava-hub/comparison-between-avalanche-cosmos-and-polkadot-a2a98f46c03b)

Polkadot

Polkadot is a heterogeneous blockchain protocol that connects multiple specialised blockchains into one unified network. It achieves scalability through a sharding infrastructure with multiple blockchains running in parallel, called parachains, that connect to a central chain called the Relay Chain. Developers can easily build custom application specific parachains through the Substrate development framework.
The relay chain validates the state transition of connected parachains, providing shared state across the entire ecosystem. If the Relay Chain must revert for any reason, then all of the parachains would also revert. This is to ensure that the validity of the entire system can persist, and no individual part is corruptible. The shared state makes it so that the trust assumptions when using parachains are only those of the Relay Chain validator set, and no other. Interoperability between parachains is enabled through the Cross-Chain Message Passing (XCMP) protocol, and it is also possible to connect to other systems through bridges, which are specially designed parachains or parathreads, each custom made to interact with another ecosystem such as Ethereum or Bitcoin. The hope is to have 100 parachains connect to the relay chain.
For a more in-depth look at Polkadot, and for more context on points made in this article, please see my three-part series — Part One, Part Two, Part Three
(There’s a quick video overview of Polkadot on YouTube - https://youtu.be/_-k0xkooSlA - which is also embedded in the medium article - https://medium.com/ava-hub/comparison-between-avalanche-cosmos-and-polkadot-a2a98f46c03b)

Avalanche

Avalanche is a platform of platforms, ultimately consisting of thousands of subnets that form a heterogeneous, interoperable network of many blockchains. It takes advantage of the revolutionary Avalanche consensus protocols to provide a secure, globally distributed, interoperable and trustless framework, offering unprecedented decentralisation whilst being able to comply with regulatory requirements.
Avalanche allows anyone to create their own tailor-made, application-specific blockchains, supporting multiple custom virtual machines such as the EVM and WASM, written in popular languages like Go (with others coming in the future) rather than lightly used, poorly understood languages like Solidity. A virtual machine can then be deployed on a custom blockchain network, called a subnet, which consists of a dynamic set of validators working together to achieve consensus on the state of a set of many blockchains, where complex rulesets can be configured to meet regulatory compliance.
Avalanche was built with serving financial markets in mind. It has native support for easily creating and trading digital smart assets with complex custom rule sets that define how the asset is handled and traded, ensuring regulatory compliance can be met. Interoperability is enabled between blockchains within a subnet as well as between subnets. Like Cosmos and Polkadot, Avalanche is also able to connect to other systems through bridges, via custom virtual machines made to interact with another ecosystem such as Ethereum or Bitcoin.
For a more in-depth look at Avalanche, and for more context on points made in this article, please see here and here
(There’s a quick video overview of Avalanche on YouTube - https://youtu.be/mWBzFmzzBAg - which is also embedded in the medium article - https://medium.com/ava-hub/comparison-between-avalanche-cosmos-and-polkadot-a2a98f46c03b)

Comparison between Cosmos, Polkadot and Avalanche

A frequent question I see asked is how Cosmos, Polkadot and Avalanche compare. Whilst there are similarities, there are also a lot of differences. This article is not intended to be an extensive in-depth list, but rather an overview based on some of the criteria that I feel are most important. For a more in-depth view I recommend reading the articles for each of the projects linked above and coming to your own conclusions. I want to stress that it’s not a case of one platform being the killer of all other platforms, far from it. There won’t be one platform to rule them all, and too often tribalism has plagued this space. Blockchains are going to completely revolutionise most industries and have a profound effect on the world we know today. It’s still very early in this space, with most adoption limited to speculation and trading, mainly due to the limitations of current blockchains and the current iteration of Ethereum, which all three of these platforms hope to address. For those who just want a quick summary, see the image at the bottom of the article. With that said, let’s have a look.

Scalability

Cosmos

Each Zone and Hub in Cosmos is capable of up to around 1000 transactions per second, with bandwidth being the bottleneck in consensus. Cosmos aims to have thousands of Zones and Hubs all connected through IBC. There is no limit on the number of Zones / Hubs that can be created.

Polkadot

Parachains in Polkadot are capable of up to around 1500 transactions per second. A portion of the parachain slots on the Relay Chain will be designated as part of the parathread pool; the performance of one parachain slot is shared between many parathreads, which offer lower performance and compete amongst themselves in a per-block auction to have their transactions included in the next relay chain block. The number of parachains is limited by the number of validators on the relay chain; the hope is to be able to achieve around 100 parachains.

Avalanche

Avalanche is capable of around 4500 transactions per second per subnet. This is based on modest hardware requirements of just 2 CPU cores and 4 GB of memory (to ensure maximum decentralisation) and a validator set of over 2,000 nodes. Performance is CPU-bound, and if higher performance is required then more specialised subnets can be created with higher minimum requirements, able to achieve 10,000+ tps in a subnet. Avalanche aims to have thousands of subnets (each with multiple virtual machines / blockchains), all interoperable with each other. There is no limit on the number of subnets that can be created.

Results

All three platforms offer vastly superior performance to the likes of Bitcoin and Ethereum 1.0. Avalanche, with its higher transactions per second, no limit on the number of subnets / blockchains that can be created, and consensus that can scale to potentially millions of validators all participating in consensus, scores ✅✅✅. Polkadot claims to offer more tps than Cosmos, but is limited in the number of parachains (around 100), whereas with Cosmos there is no limit on the number of hubs / zones that can be created. Cosmos is limited to a fairly small validator set of around 200 before performance degrades, whereas Polkadot hopes to be able to reach 1000 validators in the relay chain (albeit only a small number of validators are assigned to each parachain). Thus Cosmos and Polkadot each score ✅✅
https://preview.redd.it/2o0brllyvpq51.png?width=1000&format=png&auto=webp&s=8f62bb696ecaafcf6184da005d5fe0129d504518

Decentralisation

Cosmos

Tendermint consensus is limited to around 200 validators before performance starts to degrade. Whilst there is the Cosmos Hub, it is one of many hubs in the network, and there is no central hub or limit on the number of zones / hubs that can be created.

Polkadot

Polkadot has 1000 validators in the relay chain, and these are split into small groups that validate each parachain (a minimum of 14). The relay chain is a central point of failure as all parachains connect to it, and the number of parachains is limited by the number of validators (they hope to achieve 100 parachains). Due to the limited number of parachain slots available, significant sums of DOT will need to be purchased to win an auction to lease a slot for up to 24 months at a time. This is likely to mean only those with enough funds can secure a parachain slot. Parathreads are, however, an alternative for those that can’t secure a parachain slot and need lower, more varied performance.

Avalanche

Avalanche consensus can scale to tens of thousands of validators, even potentially millions of validators, all participating in consensus through repeated sub-sampling. The more validators, the faster the network becomes, as the load is split between them. There are modest hardware requirements so anyone can run a node, and there is no limit on the number of subnets / virtual machines that can be created.
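To make the “repeated sub-sampling” idea above more concrete, here is a minimal, heavily simplified sketch of a Snowball-style sampling loop in Go. It models only one node deciding between two options; the parameter values (k, alpha, beta) and the static preferences of the other nodes are illustrative assumptions, not Avalanche’s production configuration.

```go
package main

import (
	"fmt"
	"math/rand"
)

// Illustrative parameters only - not Avalanche's real configuration.
const (
	k     = 20 // validators sampled per round
	alpha = 15 // votes within a sample needed to count the round as a success
	beta  = 15 // consecutive successful rounds needed before deciding
)

func main() {
	// Preferences of the other validators (true/false). In the real protocol
	// every node runs this loop and updates its preference; here the rest of
	// the network is kept static for brevity.
	peers := make([]bool, 2000)
	for i := range peers {
		peers[i] = rand.Intn(100) < 85 // 85% of peers initially prefer "true"
	}

	pref := false    // this node's current preference
	consecutive := 0 // consecutive rounds confirming that preference

	for consecutive < beta {
		// Sample k peers uniformly at random and count votes for "true".
		votes := 0
		for i := 0; i < k; i++ {
			if peers[rand.Intn(len(peers))] {
				votes++
			}
		}

		switch {
		case votes >= alpha: // quorum for "true"
			if !pref {
				pref, consecutive = true, 0
			}
			consecutive++
		case k-votes >= alpha: // quorum for "false"
			if pref {
				pref, consecutive = false, 0
			}
			consecutive++
		default: // no quorum this round, reset confidence
			consecutive = 0
		}
	}
	fmt.Println("decided:", pref)
}
```

Because each round only contacts a small random sample rather than every validator, the per-node work stays roughly constant as the validator set grows, which is why the set can be so large.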

Results

Avalanche offers unparalleled decentralisation using its revolutionary consensus protocols that can scale to millions of validators all participating in consensus at the same time. There is no limit to the number of subnets and virtual machines that can be created, and they can be created by anyone for a small fee, so it scores ✅✅✅. Cosmos is limited to around 200 validators, but there is no limit on the number of zones / hubs that can be created, which anyone can create, so it scores ✅✅. Polkadot hopes to accommodate 1000 validators in the relay chain (albeit these are split amongst the parachains). The number of parachains is limited and may be cost-prohibitive for many, and the relay chain is ultimately a single point of failure. Whilst I’m definitely not saying it’s centralised, and it is more decentralised than many others, in comparison between the three it scores ✅
https://preview.redd.it/ckfamee0wpq51.png?width=1000&format=png&auto=webp&s=c4355f145d821fabf7785e238dbc96a5f5ce2846

Latency

Cosmos

Tendermint consensus used in Cosmos reaches finality within 6 seconds. Cosmos consists of many Zones and Hubs that connect to each other. Communication between two zones could pass through many hubs along the way, which can add to latency depending on the path taken, as explained in part two of the articles on Cosmos. Once finalised there is no risk of rollbacks, so there is no need to wait an extended period of time.
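As a rough, illustrative calculation only (the hop counts are made-up examples; the 6-second figure is the Tendermint finality time quoted above): if a transfer between two zones is routed through several hubs and each hop needs its own finality, the end-to-end latency compounds.

```go
package main

import "fmt"

func main() {
	const finalityPerHop = 6.0 // seconds, Tendermint finality quoted above

	// Example routes: a direct connection vs. paths through extra hubs.
	for _, hops := range []int{1, 2, 4} {
		fmt.Printf("%d hop(s): roughly %.0f seconds end to end\n",
			hops, float64(hops)*finalityPerHop)
	}
}
```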

Polkadot

Polkadot provides a hybrid consensus protocol consisting of a block production protocol, BABE, and a finality gadget called GRANDPA that works to agree on a chain, out of many possible forks, by following a fork-choice rule. Rather than voting on every block, it reaches agreement on chains. As soon as more than 2/3 of validators attest to a chain containing a certain block, all blocks leading up to that one are finalised at once.
If an invalid block is detected after it has been finalised, then the relay chain would need to be reverted along with every parachain. This is particularly important when connecting to external blockchains, as those don’t share the state of the relay chain and thus can’t be rolled back. The longer the time period, the more secure the network is, as there is more time for additional checks to be performed and reported, but at the expense of finality. Finality is reached within 60 seconds between parachains, but for external ecosystems like Ethereum their state obviously can’t be rolled back like a parachain, so finality will need to be much longer (60 minutes was suggested in the whitepaper), as discussed in more detail in part three.
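A minimal sketch of the 2/3 rule described above, assuming for simplicity a single fork-free chain and representing each validator only by the highest block it has attested to. This illustrates the threshold logic, not Polkadot’s actual GRANDPA implementation:

```go
package main

import (
	"fmt"
	"sort"
)

// finalisedHeight returns the highest block number that strictly more than
// two thirds of validators have attested to (directly or via a descendant),
// assuming a single fork-free chain for simplicity.
func finalisedHeight(attested []int, validatorCount int) int {
	sorted := append([]int(nil), attested...)
	sort.Sort(sort.Reverse(sort.IntSlice(sorted)))

	threshold := (2*validatorCount)/3 + 1 // "more than 2/3" of validators
	if len(sorted) < threshold {
		return 0 // not enough attestations to finalise anything
	}
	// The threshold-th highest attestation is the deepest block that at least
	// `threshold` validators have (transitively) voted for.
	return sorted[threshold-1]
}

func main() {
	// 9 validators; 7 of them (more than 2/3) attest to block 120 or higher,
	// so every block up to and including 120 is finalised at once.
	votes := []int{120, 125, 121, 130, 122, 126, 124, 90, 80}
	fmt.Println(finalisedHeight(votes, 9)) // prints 120
}
```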

Avalanche

Avalanche consensus achieves finality within 3 seconds, with most transactions finalising in under 1 second, immutable and completely irreversible. Any subnet can connect directly to another without having to go through multiple hops, and any VM can talk to another VM within the same subnet as well as in external subnets. Once finalised there is no risk of rollbacks, so there is no need to wait an extended period of time.

Results

With regards to performance, far too much emphasis is put on tps as a metric; the other equally important metric, if not more important with regards to finance, is latency. Throughput measures the amount of data that can be handled at any given time, whereas latency is the amount of time it takes for an action to complete. It’s pointless saying you can process more transactions per second than VISA when it takes 60 seconds for a transaction to complete. Low latency also greatly increases general usability and customer satisfaction; nowadays everyone expects card payments and online payments to happen instantly. Avalanche achieves the best results, scoring ✅✅✅, Cosmos comes in second with 6-second finality ✅✅, and Polkadot with 60-second finality (which may be 60 minutes for external blockchains) scores ✅
https://preview.redd.it/kzup5x42wpq51.png?width=1000&format=png&auto=webp&s=320eb4c25dc4fc0f443a7a2f7ff09567871648cd

Shared Security

Cosmos

Every Zone and Hub in Cosmos has its own validator set and different trust assumptions. Cosmos is researching a shared security model where a Hub can validate the state of connected zones for a fee, but this has not been released yet. Once available, this will make shared security optional rather than mandatory.

Polkadot

Shared Security is mandatory with Polkadot which uses a Shared State infrastructure between the Relay Chain and all of the connected parachains. If the Relay Chain must revert for any reason, then all of the parachains would also revert. Every parachain makes the same trust assumptions, and as such the relay chain validates state transition and enables seamless interoperability between them. In return for this benefit, they have to purchase DOT and win an auction for one of the available parachain slots.
However, parachains can’t rely solely on the relay chain for their security. As discussed in part three, they also need to implement censorship-resistance measures and their own sybil-resistance mechanisms, using proof of work or proof of stake on the parachain as well.

Avalanche

A subnet in Avalanche consists of a dynamic set of validators working together to achieve consensus on the state of a set of many blockchains, where complex rulesets can be configured to meet regulatory compliance. So unlike Cosmos, where each zone / hub has its own validators, a subnet can validate one or many virtual machines / blockchains with a single validator set. Shared security is optional.

Results

Shared security is mandatory in Polkadot and a key design decision in its infrastructure. The relay chain validates the state transition of all connected parachains, and thus it scores ✅✅✅. Subnets in Avalanche can validate the state of either a single or many virtual machines. Each subnet can have its own token and shares a validator set, where complex rulesets can be configured to meet regulatory compliance. It scores ✅✅. Every Zone and Hub in Cosmos has its own validator set / token; research is underway to have a hub validate the state transition of connected zones, but as this is still early in the research phase it scores ✅ for now.
https://preview.redd.it/pbgyk3o3wpq51.png?width=1000&format=png&auto=webp&s=61c18e12932a250f5633c40633810d0f64520575

Current Adoption

Cosmos

The Cosmos project started in 2016 with an ICO held in April 2017. There are currently around 50 projects building on the Cosmos SDK; a full list can be seen here by filtering for Cosmos SDK. Not all of the projects will necessarily connect using the native Cosmos SDK and IBC, and some, such as Binance Chain, have forked parts of the Cosmos SDK and utilise Tendermint consensus, but have said they will connect in the future.

Polkadot

The Polkadot project started in 2016 with an ICO held in October 2017. There are currently around 70 projects building on Substrate; a full list can be seen here by filtering for Substrate Based. As with Cosmos, not all projects built using Substrate will necessarily connect to Polkadot, and parachains and parathreads aren’t currently implemented in either the live network or the test network (Kusama) as of the time of this writing.

Avalanche

Avalanche in comparison started much later, with Ava Labs being founded in 2018. Avalanche held its ICO in July 2020. Due to the much shorter time it has been in development, the number of confirmed projects is smaller, with around 14 projects currently building on Avalanche. Due to the customisability of the platform, though, many virtual machines can be used within a subnet, making it incredibly easy to port projects over. As an example, it will launch with the Ethereum Virtual Machine, which enables byte-for-byte compatibility, and all the tooling like Metamask, Truffle etc. will work, so projects can easily move over to benefit from the performance, decentralisation and low gas fees offered. In the future, Cosmos and Substrate virtual machines could be implemented on Avalanche.
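As a small illustration of what byte-for-byte EVM compatibility means in practice, a standard Ethereum client library should be able to talk to an Avalanche EVM chain unchanged. The sketch below uses go-ethereum’s ethclient; the RPC URL is an assumed local-node endpoint and may differ in your setup.

```go
package main

import (
	"context"
	"fmt"
	"log"

	"github.com/ethereum/go-ethereum/ethclient"
)

func main() {
	// Assumed local C-Chain RPC endpoint - substitute your own node or a
	// hosted provider URL here.
	client, err := ethclient.Dial("http://localhost:9650/ext/bc/C/rpc")
	if err != nil {
		log.Fatal(err)
	}

	// Standard Ethereum JSON-RPC calls work unchanged against an EVM chain.
	chainID, err := client.ChainID(context.Background())
	if err != nil {
		log.Fatal(err)
	}
	head, err := client.BlockNumber(context.Background())
	if err != nil {
		log.Fatal(err)
	}
	fmt.Printf("connected to EVM chain %v at block %d\n", chainID, head)
}
```

Tools like Metamask and Truffle work the same way: they only need to be pointed at the chain’s RPC endpoint and chain ID.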

Results

Whilst it’s still early for all three projects (and the entire blockchain space as a whole), there are currently more projects confirmed to be building on Cosmos and Polkadot, mostly due to their longer time in development. Whilst Cosmos has fewer projects than Polkadot, zones are already implemented, whereas Polkadot doesn’t currently have parachains; IBC, to connect zones and hubs together, is due to launch Q2 2021. Thus both score ✅✅✅. Avalanche has been in development for a much shorter time, but is launching with an impressive feature set right from the start: the ability to create subnets, VMs, assets, NFTs, permissioned and permissionless blockchains, cross-chain atomic swaps within a subnet, smart contracts, a bridge to Ethereum etc. Applications can easily port over from other platforms and use all the existing tooling such as Metamask / Truffle etc., but benefit from the performance, decentralisation and low gas fees offered. Currently though, just based on the number of projects in comparison, it scores ✅.
https://preview.redd.it/4zpi6s85wpq51.png?width=1000&format=png&auto=webp&s=e91ade1a86a5d50f4976f3b23a46e9287b08e373

Enterprise Adoption

Cosmos

Cosmos enables permissioned and permissionless zones which can connect to each other, with the ability to have full control over who validates the blockchain. Each zone / hub can have its own token, and even for permissionless zones the zone is in control of who validates.

Polkadot

With Polkadot, the state transition is performed by a small, randomly assigned group of validators from the relay chain, with the possibility that state is rolled back if an invalid transaction from any of the other parachains is found. This may pose a problem for enterprises that need complete control over who performs validation for regulatory reasons. In addition, due to the limited number of parachain slots available, enterprises would have to acquire and lock up large amounts of a highly volatile asset (DOT), with the possibility that they are outbid in future auctions and find they can no longer have their parachain validated, while parathreads don’t provide the guaranteed performance requirements for the application to function.

Avalanche

Avalanche enables permissioned and permissionless subnets, and complex rulesets can be configured to meet regulatory compliance. For example, a subnet can be created where it’s mandatory that all validators are from a certain legal jurisdiction, or hold a specific licence and are regulated by the SEC etc. Subnets are also able to scale to tens of thousands of validators, and even potentially millions of nodes, all participating in consensus, so every enterprise can run their own node rather than only a small number. Enterprises don’t have to hold large amounts of a highly volatile asset, but instead pay a fee in AVAX for the creation of subnets and blockchains, which is burnt.
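To illustrate the kind of ruleset described above, here is a purely hypothetical sketch: the types, fields and checks are invented for illustration and are not the actual Avalanche API. It shows how a permissioned subnet might gate who is allowed to validate.

```go
package main

import "fmt"

// Validator describes a candidate node; field names are hypothetical.
type Validator struct {
	NodeID       string
	Jurisdiction string
	Licences     map[string]bool
}

// SubnetRules is an illustrative ruleset: allowed jurisdictions plus a
// required licence, standing in for "regulated by the SEC etc."
type SubnetRules struct {
	AllowedJurisdictions map[string]bool
	RequiredLicence      string
}

// MayValidate applies the subnet's ruleset to a candidate validator.
func (r SubnetRules) MayValidate(v Validator) bool {
	return r.AllowedJurisdictions[v.Jurisdiction] && v.Licences[r.RequiredLicence]
}

func main() {
	rules := SubnetRules{
		AllowedJurisdictions: map[string]bool{"US": true},
		RequiredLicence:      "SEC-registered",
	}
	candidate := Validator{
		NodeID:       "NodeID-example",
		Jurisdiction: "US",
		Licences:     map[string]bool{"SEC-registered": true},
	}
	fmt.Println(rules.MayValidate(candidate)) // true: meets the subnet's rules
}
```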

Results

Avalanche provides the customisability to run private permissioned blockchains as well as permissionless ones, where the enterprise is in control of who validates the blockchain, with the ability to use complex rulesets to meet regulatory compliance; thus it scores ✅✅✅. Cosmos is also able to run permissioned and permissionless zones / hubs, so enterprises have full control over who validates a blockchain, and it scores ✅✅. Polkadot requires locking up large amounts of a highly volatile asset, with the possibility of being outbid by competitors and being unable to run the application (and having to migrate away) if guaranteed performance is required. The relay chain validates the state transition and can roll back a parachain should an invalid block be detected on another parachain; thus it scores ✅.
https://preview.redd.it/li5jy6u6wpq51.png?width=1000&format=png&auto=webp&s=e2a95f1f88e5efbcf9e23c789ae0f002c8eb73fc

Interoperability

Cosmos

Cosmos will connect Hubs and Zones together through its IBC protocol (due to launch in 2021). Connecting to blockchains outside of the Cosmos ecosystem would either require the connected blockchain to fork its code to implement IBC or, more likely, a custom “Peg Zone” will be created specifically to work with the particular blockchain it’s trying to bridge to, such as Ethereum. Each Zone and Hub has different trust levels, and connectivity between two zones can have different trust assumptions depending on which path it takes (this is discussed more in this article). Finality time is low at 6 seconds, but depending on the number of hops, this can increase significantly.

Polkadot

Polkadot’s shared state means each parachain that connects shares the same trust assumptions (those of the relay chain validators), and that if one blockchain needs to be reverted, all of them will need to be reverted. Interoperability between parachains is enabled through the Cross-Chain Message Passing (XCMP) protocol, and it is also possible to connect to other systems through bridges, which are specially designed parachains or parathreads, each custom made to interact with another ecosystem such as Ethereum or Bitcoin. Finality time between parachains is around 60 seconds, but longer will be needed for connecting to external blockchains (initial figures of 60 minutes in the whitepaper), thus limiting the appeal of connecting two external ecosystems together through Polkadot. Polkadot is also limited in the number of parachain slots available, thus limiting the number of blockchains that can be bridged. Parathreads could be used for lower-performance bridges, but the speed of future blockchains is only going to increase.

Avalanche

A subnet can validate multiple virtual machines / blockchains, and all blockchains within a subnet share the same trust assumptions / validator set, enabling cross-chain interoperability. Interoperability is also possible between subnets, with the hope that Avalanche will consist of thousands of subnets. Each subnet may have a different trust level, but as the primary network consists of all validators, it can be used as a source of trust if required. As Avalanche supports many virtual machines, bridges to other ecosystems are created by running the connected ecosystem’s virtual machine. There will be an Ethereum bridge using the EVM shortly after mainnet. Finality time is much faster, at sub 3 seconds (with most happening under 1 second), with no chance of rolling back, so it is more appealing when connecting to external blockchains.

Results

All three systems are able to provide interoperability within their ecosystems, transferring assets as well as data, and can use bridges to connect to external blockchains. Cosmos has different trust levels between its zones and hubs, which can create issues depending on the path a transfer takes, with additional latency added. Polkadot provides the same trust assumptions for all connected parachains, but has longer finality and a limited number of parachain slots available. Avalanche provides the same trust assumptions for all blockchains within a subnet, and different trust levels between subnets; however, as the primary network consists of all validators, it can be used as a source of trust. Avalanche also has a much faster finality time, with no limitation on the number of blockchains / subnets / bridges that can be created. Overall, all three excel at interoperability within their ecosystems and each score ✅✅.
https://preview.redd.it/ai0bkbq8wpq51.png?width=1000&format=png&auto=webp&s=3e85ee6a3c4670f388ccea00b0c906c3fb51e415

Tokenomics

Cosmos

The ATOM token is the native token of the Cosmos Hub. People commonly mistake it for a token used throughout the Cosmos ecosystem, whereas it’s just used for one of many hubs in Cosmos, each of which has its own token. Currently ATOM has little utility, as IBC isn’t released and the hub has no connections to other zones / hubs. Once IBC is released, zones may prefer to connect to a different hub instead, in which case ATOM would not be used. ATOM isn’t a fixed-cap supply token; supply will continuously increase with yearly inflation of around 10%, depending on the percentage staked. The current market cap for ATOM as of the time of this writing is $1 billion with a 203 million circulating supply. Rewards can be earned through staking to offset the dilution caused by inflation. Delegators can also get slashed and lose a portion of their ATOM should the validator misbehave.
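A quick back-of-the-envelope illustration of that last point, using the roughly 10% yearly inflation mentioned above; the 70% staking ratio is an assumed example value, not a quoted figure.

```go
package main

import "fmt"

func main() {
	inflation := 0.10   // ~10% new tokens minted per year (figure from the article)
	stakedRatio := 0.70 // assumed share of supply that is staked

	// Newly minted tokens go to stakers, so their balance grows faster than
	// total supply, while a non-staker's share of the supply shrinks.
	stakerYield := inflation / stakedRatio // ~14.3% nominal staking reward
	stakerShareChange := (1 + stakerYield) / (1 + inflation)
	holderShareChange := 1 / (1 + inflation)

	fmt.Printf("staker's share of supply after a year:     x%.3f\n", stakerShareChange) // >1: gains share
	fmt.Printf("non-staker's share of supply after a year: x%.3f\n", holderShareChange) // <1: diluted ~9%
}
```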

Polkadot

Polkadot’s native token is DOT and it’s used to secure the Relay Chain. Each parachain needs to acquire sufficient DOT to win an auction for an available parachain lease period of up to 24 months at a time. Parathreads have a fixed registration fee that would realistically be much lower than the cost of acquiring a parachain slot, and they compete with other parathreads in a per-block auction to have their transactions included in the next relay chain block. DOT isn’t a fixed-cap supply token; supply will continuously increase with yearly inflation of around 10%, depending on the percentage staked. The current market cap for DOT as of the time of this writing is $4.4 billion with an 852 million circulating supply. Delegators can also get slashed and lose their DOT (potentially 100% of their DOT for serious attacks) should the validator misbehave.

Avalanche

AVAX is the native token of the primary network in Avalanche. Every validator of any subnet also has to validate the primary network and stake a minimum of 2,000 AVAX. There is no limit on the number of validators, unlike other consensus methods, so this can cater for tens of thousands, even potentially millions, of validators. As every validator validates the primary network, it can be a source of trust for interoperability between subnets as well as for connecting to other ecosystems, thus increasing the amount of transaction fees paid in AVAX. There is no slashing in Avalanche, so there is no risk of losing your AVAX when selecting a validator; instead, rewards earned for staking can be slashed should the validator misbehave. Because Avalanche doesn’t have direct slashing, it is technically possible for someone to both stake AND deliver tokens for something like a flash loan, under the invariant that all tokens that are staked are returned, thus being able to make profit with staked tokens outside of staking itself.
There will also be a separate subnet for Athereum, which is a ‘spoon’, or friendly fork, of Ethereum, benefiting from the Avalanche consensus protocol and the applications in the Ethereum ecosystem. Its native token, ATH, will be airdropped to ETH holders as well as potentially to AVAX holders. This can be done for other blockchains as well.
Transaction fees on the primary network for all three of its blockchains, as well as subscription fees for creating a subnet and blockchain, are paid in AVAX and are burnt, creating deflationary pressure. AVAX has a fixed capped supply of 720 million tokens, creating scarcity, rather than an unlimited supply which continuously increases at a compounded rate each year like others. Initially there will be 360 million tokens minted at mainnet, with vesting periods between 1 and 10 years and tokens gradually unlocking each quarter. The circulating supply is 24.5 million AVAX, with tokens gradually released each quarter. The current market cap of AVAX is around $100 million.

Results

Avalanche’s AVAX, with its fixed capped supply, deflationary pressure, very strong utility, potential to receive airdrops and low market cap, scores ✅✅✅. Polkadot’s DOT also has very strong utility with the need for auctions to acquire parachain slots, but it has no deflationary mechanisms, no fixed capped supply and is already valued at $4.4 billion, therefore it scores ✅✅. Cosmos’s ATOM token is only for the Cosmos Hub, one of what will be many hubs in the ecosystem, and has very little utility currently (this may improve once IBC is released and if the Cosmos Hub actually becomes the hub that people want to connect to, rather than something like Binance instead). There is no fixed capped supply and it is currently valued at $1 billion, so it scores ✅.
https://preview.redd.it/mels7myawpq51.png?width=1000&format=png&auto=webp&s=df9782e2c0a4c26b61e462746256bdf83b1fb906
All three are excellent projects and have similarities as well as many differences. Just to reiterate, this article is not intended to be an extensive in-depth list, but rather an overview based on some of the criteria that I feel are most important. For a more in-depth view I recommend reading the articles for each of the projects linked above and coming to your own conclusions; you may have different criteria which are important to you, and score them differently. There won’t be one platform to rule them all, however, with some use cases better suited to one platform over another, and it’s not a zero-sum game. Blockchain is going to completely revolutionise industries and the Internet itself. The more projects researching and delivering breakthrough technology the better, each learning from the others and pushing each other to reach that goal earlier. The current market is a tiny speck of what’s in store in terms of value and adoption, and it’s going to be exciting to watch it unfold.
https://preview.redd.it/dbb99egcwpq51.png?width=1388&format=png&auto=webp&s=aeb03127dc0dc74d0507328e899db1c7d7fc2879
For more information see the articles below (each with additional sources at the bottom of their articles)
Avalanche, a Revolutionary Consensus Engine and Platform. A Game Changer for Blockchain
Avalanche Consensus, The Biggest Breakthrough since Nakamoto
Cosmos — An Early In-Depth Analysis — Part One
Cosmos — An Early In-Depth Analysis — Part Two
Cosmos Hub ATOM Token and the commonly misunderstood staking tokens — Part Three
Polkadot — An Early In-Depth Analysis — Part One — Overview and Benefits
Polkadot — An Early In-Depth Analysis — Part Two — How Consensus Works
Polkadot — An Early In-Depth Analysis — Part Three — Limitations and Issues
submitted by xSeq22x to CryptoCurrency

d down, k up, everybody's a game theorist, titcoin, build wiki on Cardano, (e-)voting, competitive marketing analysis, Goguen product update, Alexa likes Charles, David hates all, Adam in and bros in arms with the scientific counterparts of the major cryptocurrency groups, the latest AMA for all!

Decreasing d parameter
Just signed the latest change management document, I was the last in the chain so I signed it today for changing the d parameter from 0.52 to 0.5. That means we are just about to cross the threshold here in a little bit for d to fall below 0.5 which means more than half of all the blocks will be made by the community and not the OBFT nodes. That's a major milestone and at this current rate of velocity it looks like d will decrement to zero around March so lots to do, lots to talk about. Product update, two days from now, we'll go ahead and talk about that but it crossed my desk today and I was really happy and excited about that and it seemed like yesterday that d was equal to one and people were complaining that we delayed it by an epoch and now we're almost at 50 percent. For those of you who want parameter-level changes, k-level changes, they are coming and there's an enormous internal conversation about it and we've written up a powerpoint presentation and a philosophy document about why things were designed the way that they're designed.
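To put the d parameter in concrete terms (a rough illustration only; the blocks-per-epoch figure is an assumption for the example, not something stated in the AMA): d is the fraction of blocks still produced by the federated OBFT nodes, so 1 - d is the community’s share.

```go
package main

import "fmt"

func main() {
	const blocksPerEpoch = 21600.0 // assumed round figure, for illustration only

	for _, d := range []float64{1.0, 0.52, 0.5, 0.0} {
		obft := d * blocksPerEpoch        // blocks from the federated OBFT nodes
		community := (1 - d) * blocksPerEpoch // blocks from community stake pools
		fmt.Printf("d=%.2f  OBFT blocks ~ %5.0f  community blocks ~ %5.0f\n",
			d, obft, community)
	}
}
```

Once d drops below 0.5, the community line overtakes the OBFT line, which is exactly the milestone described above; at d = 0 the OBFT nodes produce nothing.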
Increasing k parameter and upcoming security video and everybody's a game theorist
My chief scientist has put an enormous amount of time into this. Aggelos is very passionate about this particular topic and what I'm going to do is similar to the security video that I did where I did an hour and a half discussion about a best practice for security. I'm going to actually do a screencasted video where I talk about this philosophy document and I'm going to read the entire document with annotations with you guys and kind of talk through it. It might end up being quite a long video. It could be several hours long but I think it's really important to talk around the design philosophy of this. It's kind of funny, everybody, when they see a cryptographic paper or math paper, they tend to just say okay you guys figure that out. No one's an expert in cryptography or math and you don't really get strong opinions about it but game theory despite the fact that the topics as complex and in some cases more complex you tend to get a lot of opinions and everybody's a game theorist. So, there was enormous amount of thought that went into the design of the system, the parameters of system, everything from the reward functions to other things and it's very important that we explain that thought process in as detailed of a way as possible. At least the philosophy behind it then I feel that the community is in a really good position to start working on the change management. It is my position that I'd love to see k largely increased. I do think that the software needs some improvements to get there especially partial delegation delegation portfolios and some enhancements into the operation of staking especially.
E-voting
I’d love to see the existence of hybrid wallets where you have a cold part and a hot part, and we’ve had a lot of conversations about that; we will present some of the progress in that matter at the product updates, if not this October then certainly in November. A lot of commercialization going on, a lot of things flowing around and, you know, commercial teams working hard. As I mentioned we have a lot of deals in the pipeline. The Wyoming event was half political, half sales. We were really looking into e-voting and we had very productive conversations along those lines. It is my goal that Cardano e-voting software is used in political primaries, and my hope is for it eventually to be used in municipal, state and eventually federal elections, and then in national elections for countries like Ethiopia, Mongolia and other places. Now there is a long road, a long, long road, to get there and many little victories that have to come first, but this event in Wyoming was kind of the opener into that conversation: there were seven independent parties at the independent national convention and we had a chance to talk to the leadership of many of them. We will also engage in conversation with the libertarian party leadership, and at the very least we could talk about e-voting and also blockchain-based voting for primaries - that would be a great start - and we’ll also look into the state of Wyoming for that as well. We’ll, you know, tell you guys about that in time. We’ve already gotten a lot of inquiries about e-voting software. We tend to get them along with the (Atala) Prism inquiries. It’s actually quite easy to start conversations, but there are a lot of security properties that are very important, like end-to-end verifiability, hybrid ballots where you have both a digital and a paper ballot, delegation mechanics, as well as privacy mechanics that are interesting on a case-by-case basis.
Goguen, voting, future fund3, competitive marketing analysis of Ouroboros vs. EOS, Tezos, Algorand, ETH2 and Polkadot, new creative director
We’ll keep chipping away at that. A lot of Goguen stuff to talk about, but I’m going to reserve all of that for two days from now for the product update. We’re right in the middle of it; Goguen metadata was the very first part. We already have some commercialization platform as a result of metadata, more to come, and then obviously lots of smart contract stuff to come. This update and the November update are going to be very Goguen focused, and also a lot of alternatives as well. We’re still on schedule for an HFC event in, I think, November or December, I can’t remember, but that’s going to be carrying a lot of things related to multisig and token locking. There are some ledger rule changes, so it has to be an HFC event, and that opens up a lot of the windows for Goguen foundations as well as voting on chain, so fund3 will benefit very heavily from that. We’re right in the guts of Daedalus right now building the voting center, the identity center, QR-code work. All this stuff, it’s a lot of stuff, you know; the cell phone app was released last week. Kind of an early beta, it’ll go through a lot of rapid iterations every few weeks. We’ll update it; google play is a great foundation to launch things on because it’s so easy to push updates to people automatically, so you can rapidly iterate and be very agile in that framework. And you know we’ve already had 3500 people involved heavily in the innovation management platform ideascale, and we’ve got numerous bids from everyone, from John Buck and the sociocracy movement to others. A lot of people want to help us improve that and we’re going to see steady and systematic growth there. We’re still chipping away at product marketing. Liza (Horowitz) is doing a good job, I meet with her two or three times a week, and right now it’s Ouroboros, Ouroboros, Ouroboros... We’re doing competitive analysis of Ouroboros versus EOS, Tezos, Algorand, ETH2 and Polkadot. We think that’s a good set. We think we have a really good way of explaining it. David (David Likes Crypto now at IOHK) has already made some great content. We’re going to release that soon alongside some other content and we’ll keep chipping away at that.
We also just hired a creative director for IO Global. His name’s Adam, an incredibly experienced creative director; he’s worked for Mercedes-Benz and dozens of other companies. He does very good work and he’s been doing this for well over 20 years, and so the very first set of things he’s going to do is work with commercial and marketing on product marketing, in addition to building great content, where the hope is to make that content as pretty as possible. We have Rod heavily involved in that as well, to talk about distribution channels and see if we can amplify the distribution message and really get a lot of stuff done. Last thing to mention, oh yeah, iOS for Catalyst. We’re working on that; we submitted it to the apple store, the iOS store, but it takes a little longer to get approval there than it does with google play. It’s been submitted, and now it’s down to whenever apple approves it or not. Takes a little longer for cryptocurrency stuff.
Wiki shizzle and battle for crypto, make crypto articles on wiki great again, Alexa knows Charles, Everpedia meets Charles podcast, holy-grail land of Cardano, wiki on Cardano, titcoin
Wikipedia... kind of rattled the cage a little bit. Through an intermediary we got in contact with Jimmy Wales. Larry Sanger, the other co-founder, also reached out to me, and the everpedia guys reached out to me. Here’s where we stand: we have an article, it has solidified, and it’s currently labeled as unreliable and you should not believe the things that are said in it, which is David Gerard’s work if you look at the edits. We will work with the community and try to get that article to a fair and balanced representation of Cardano, and especially after the product marketing comes through and we clearly explain the product, I think the Cardano article can be massively strengthened. I’ve told Rod to work with some specialized people to try to get that done, but we are going to work very hard at a systematic approval campaign for all of the scientific articles related to blockchain technology in the cryptocurrency space. They’re just terrible; if you go to the proof of work article, the proof of stake one or all these things, they’re just terrible. They’re not well written, they’re out of date and they don’t reflect an adequate sampling of the science. I did talk to my chief scientist Aggelos, and what we’re gonna do is reach out to the scientific counterparts at most of the major cryptocurrency groups that are doing research and see if they want to work with us in an industry-wide effort to systematically improve the scientific articles in our industry, so that they are a fair and balanced representation of what the current state of the art is, the criticisms, the trade-offs as well as the reference space. And of course obviously we’ll do quite well in that respect because we’ve done the science, we’re the inheritor of it, but it’s a shame because when people search proof of stake on google, usually the wikipedia results are highly biased. We care about wikipedia because google cares about wikipedia, amazon cares about wikipedia.
If you ask Alexa who Charles Hoskinson is, the reason Alexa knows is because it’s reading directly from the wikipedia page. If I didn’t have a wikipedia page, Alexa wouldn’t know that. So if somebody says “Alexa, what is Cardano”, it’s going to read directly from the wikipedia page, and, you know, we can either just pretend that reality doesn’t exist or we can accept it, and we as a community, working with partners in the broader cryptocurrency community, can universally improve the quality of cryptocurrency pages. There’s been a pattern of commercial censorship on wikipedia for cryptocurrencies in general since bitcoin itself. In fact I think the bitcoin article was actually taken down once back in, might have been, 2010 or 2009, but basically wikipedia has not been a friend of cryptocurrencies. That’s why everpedia exists, and actually their founders reached out to me and I talked to them over twitter through PMs and we agreed to do a podcast. I’m going to do a streamyard stream with these guys and they’ll come on and talk all about everpedia and what they do and how they are, and we’ll kind of go through the challenges that they’ve encountered, how their platform works and so forth. And obviously if they ever want to leave that terrible ecosystem EOS and come to the holy-grail land of Cardano we’d be there to help them out. At least they can tell the world how amazing their product is and also the challenges they’re having to overcome. We’ve also been in great contact with Larry Sanger.
He’s going to do an internal seminar at some point with us and talk about some protocols he’s been developing since he left wikipedia, specifically to decentralize knowledge management and have a truly decentralized encyclopedia. I’m really looking forward to that and I hope that presentation gives us some inspiration as an ecosystem of things we can do. That’s a great piece of infrastructure regardless. After we learn a lot more about it and we talk to a lot of people in the ecosystem, if we can’t get people to move on over, it would be really good to see, through ideascale and the innovation management platform, people utilize the dc fund to build their own variant of wikipedia on Cardano. In the coming months there will certainly be funding available. If you guys are so passionate about this particular problem that you want to go solve it, then I’d be happy to play Elon Musk with the hyperloop and write a white paper on a protocol design to really give a good first start, and then you guys can go and try to commercialize that technology as Cardano native assets and Plutus smart contracts, in addition to other pieces of technology that have to be brought in to make it practical.
Right now we’re just in the “let’s talk to everybody” phase, and we’ll talk to the everpedia guys, we’re going to talk to Larry, and we’re going to see whoever else is in this game, and of course we have to accept the incumbency as it is. So, we’re working with obviously the wikipedia side to improve the quality of not only our article but all of the articles and the scientific side of things, so that there’s a fair and accurate representation of information. One of the reasons why I’m so concerned about this is that I am very worried that Cardano projects will get commercially censored like we were commercially censored. So, yes, we do have a page, but it took five years to get there and we’re a multi-billion dollar project with hundreds of thousands of people. If you guys are doing cutting-edge, novel, interesting stuff, I don’t want your experience to be the same as ours, where you have to wait five years for your project to get a page, even after governments adopted it. That’s absurd, no one should be censored ever. This is very much a fight for the entire ecosystem, the entire community, not just Cardano but all cryptocurrencies: bitcoin, ethereum and Cardano have all faced commercial censorship and article deletions during their tenure, so I don’t want you guys to go through that. I’m hoping we can improve that situation, but you know you don’t put all your eggs in one basket, and frankly the time has come for wikipedia to be fully decentralized and liberated from a centralized organization and massively variable quality in the editor base. If legends of valor has a page but Cardano didn’t have one until recently, if titcoin, a pornography coin from 2015 that’s deprecated and no one uses, has a page but Cardano couldn’t get one, there’s something seriously wrong with the quality control mechanism, and we need to improve that, so it’ll get done.
submitted by stake_pool to cardano

[ CryptoCurrency ] Comparison between Avalanche, Cosmos and Polkadot

[ 🔴 DELETED 🔴 ] Topic originally posted in CryptoCurrency by xSeq22x [link]
A frequent question I see being asked is how Cosmos, Polkadot and Avalanche compare? Whilst there are similarities there are also a lot of differences. This article is not intended to be an extensive in-depth list, but rather an overview based on some of the criteria that I feel are most important.
For better formatting see https://medium.com/ava-hub/comparison-between-avalanche-cosmos-and-polkadot-a2a98f46c03b
https://preview.redd.it/lg16iwk2dhq51.png?width=428&format=png&auto=webp&s=6c899ee69800dd6c5e2900d8fa83de7a43c57086

Overview

Cosmos

Cosmos is a heterogeneous network of many independent parallel blockchains, each powered by classical BFT consensus algorithms like Tendermint. Developers can easily build custom application specific blockchains, called Zones, through the Cosmos SDK framework. These Zones connect to Hubs, which are specifically designed to connect zones together.
The vision of Cosmos is to have thousands of Zones and Hubs that are Interoperable through the Inter-Blockchain Communication Protocol (IBC). Cosmos can also connect to other systems through peg zones, which are specifically designed zones that each are custom made to interact with another ecosystem such as Ethereum and Bitcoin. Cosmos does not use Sharding with each Zone and Hub being sovereign with their own validator set.
For a more in-depth look at Cosmos and provide more reference to points made in this article, please see my three part series — Part One, Part Two, Part Three
https://youtu.be/Eb8xkDi_PUg

Polkadot

Polkadot is a heterogeneous blockchain protocol that connects multiple specialised blockchains into one unified network. It achieves scalability through a sharding infrastructure with multiple blockchains running in parallel, called parachains, that connect to a central chain called the Relay Chain. Developers can easily build custom application specific parachains through the Substrate development framework.
The relay chain validates the state transition of connected parachains, providing shared state across the entire ecosystem. If the Relay Chain must revert for any reason, then all of the parachains would also revert. This is to ensure that the validity of the entire system can persist, and no individual part is corruptible. The shared state makes it so that the trust assumptions when using parachains are only those of the Relay Chain validator set, and no other. Interoperability is enabled between parachains through Cross-Chain Message Passing (XCMP) protocol and is also possible to connect to other systems through bridges, which are specifically designed parachains or parathreads that each are custom made to interact with another ecosystem such as Ethereum and Bitcoin. The hope is to have 100 parachains connect to the relay chain.
For a more in-depth look at Polkadot and provide more reference to points made in this article, please see my three part series — Part One, Part Two, Part Three
https://youtu.be/_-k0xkooSlA

Avalanche

Avalanche is a platform of platforms, ultimately consisting of thousands of subnets to form a heterogeneous interoperable network of many blockchains, that takes advantage of the revolutionary Avalanche Consensus protocols to provide a secure, globally distributed, interoperable and trustless framework offering unprecedented decentralisation whilst being able to comply with regulatory requirements.
Avalanche allows anyone to create their own tailor-made application specific blockchains, supporting multiple custom virtual machines such as EVM and WASM and written in popular languages like Go (with others coming in the future) rather than lightly used, poorly-understood languages like Solidity. This virtual machine can then be deployed on a custom blockchain network, called a subnet, which consist of a dynamic set of validators working together to achieve consensus on the state of a set of many blockchains where complex rulesets can be configured to meet regulatory compliance.
Avalanche was built with serving financial markets in mind. It has native support for easily creating and trading digital smart assets with complex custom rule sets that define how the asset is handled and traded to ensure regulatory compliance can be met. Interoperability is enabled between blockchains within a subnet as well as between subnets. Like Cosmos and Polkadot, Avalanche is also able to connect to other systems through bridges, through custom virtual machines made to interact with another ecosystem such as Ethereum and Bitcoin.
For a more in-depth look at Avalanche and provide more reference to points made in this article, please see here and here
https://youtu.be/mWBzFmzzBAg

Comparison between Cosmos, Polkadot and Avalanche

A frequent question I see being asked is how Cosmos, Polkadot and Avalanche compare? Whilst there are similarities there are also a lot of differences. This article is not intended to be an extensive in-depth list, but rather an overview based on some of the criteria that I feel are most important. For a more in-depth view I recommend reading the articles for each of the projects linked above and coming to your own conclusions. I want to stress that it’s not a case of one platform being the killer of all other platforms, far from it. There won’t be one platform to rule them all, and too often the tribalism has plagued this space. Blockchains are going to completely revolutionise most industries and have a profound effect on the world we know today. It’s still very early in this space with most adoption limited to speculation and trading mainly due to the limitations of Blockchain and current iteration of Ethereum, which all three of these platforms hope to address. For those who just want a quick summary see the image at the bottom of the article. With that said let’s have a look

Scalability

Cosmos

Each Zone and Hub in Cosmos is capable of up to around 1000 transactions per second with bandwidth being the bottleneck in consensus. Cosmos aims to have thousands of Zones and Hubs all connected through IBC. There is no limit on the number of Zones / Hubs that can be created

Polkadot

Parachains in Polkadot are also capable of up to around 1500 transactions per second. A portion of the parachain slots on the Relay Chain will be designated as part of the parathread pool, the performance of a parachain is split between many parathreads offering lower performance and compete amongst themselves in a per-block auction to have their transactions included in the next relay chain block. The number of parachains is limited by the number of validators on the relay chain, they hope to be able to achieve 100 parachains.

Avalanche

Avalanche is capable of around 4500 transactions per second per subnet, this is based on modest hardware requirements to ensure maximum decentralisation of just 2 CPU cores and 4 GB of Memory and with a validator size of over 2,000 nodes. Performance is CPU-bound and if higher performance is required then more specialised subnets can be created with higher minimum requirements to be able to achieve 10,000 tps+ in a subnet. Avalanche aims to have thousands of subnets (each with multiple virtual machines / blockchains) all interoperable with each other. There is no limit on the number of Subnets that can be created.

Results

All three platforms offer vastly superior performance to the likes of Bitcoin and Ethereum 1.0. Avalanche with its higher transactions per second, no limit on the number of subnets / blockchains that can be created and the consensus can scale to potentially millions of validators all participating in consensus scores ✅✅✅. Polkadot claims to offer more tps than cosmos, but is limited to the number of parachains (around 100) whereas with Cosmos there is no limit on the number of hubs / zones that can be created. Cosmos is limited to a fairly small validator size of around 200 before performance degrades whereas Polkadot hopes to be able to reach 1000 validators in the relay chain (albeit only a small number of validators are assigned to each parachain). Thus Cosmos and Polkadot scores ✅✅
https://preview.redd.it/ththwq5qdhq51.png?width=1000&format=png&auto=webp&s=92f75152c90d984911db88ed174ebf3a147ca70d

Decentralisation

Cosmos

Tendermint consensus is limited to around 200 validators before performance starts to degrade. Whilst there is the Cosmos Hub it is one of many hubs in the network and there is no central hub or limit on the number of zones / hubs that can be created.

Polkadot

Polkadot has 1000 validators in the relay chain and these are split up into a small number that validate each parachain (minimum of 14). The relay chain is a central point of failure as all parachains connect to it and the number of parachains is limited depending on the number of validators (they hope to achieve 100 parachains). Due to the limited number of parachain slots available, significant sums of DOT will need to be purchased to win an auction to lease the slot for up to 24 months at a time. Thus likely to lead to only those with enough funds to secure a parachain slot. Parathreads are however an alternative for those that require less and more varied performance for those that can’t secure a parachain slot.

Avalanche

Avalanche consensus scan scale to tens of thousands of validators, even potentially millions of validators all participating in consensus through repeated sub-sampling. The more validators, the faster the network becomes as the load is split between them. There are modest hardware requirements so anyone can run a node and there is no limit on the number of subnets / virtual machines that can be created.

Results

Avalanche offers unparalleled decentralisation using its revolutionary consensus protocols that can scale to millions of validators all participating in consensus at the same time. There is no limit to the number of subnets and virtual machines that can be created, and they can be created by anyone for a small fee, so it scores ✅✅✅. Cosmos is limited to around 200 validators, but there is no limit on the number of zones / hubs that can be created, which anyone can create, and it scores ✅✅. Polkadot hopes to accommodate 1,000 validators in the Relay Chain (albeit these are split amongst the parachains). The number of parachains is limited and may be cost-prohibitive for many, and the Relay Chain is ultimately a single point of failure. Whilst it is definitely not centralised, and is more decentralised than many others, in comparison between the three it scores ✅
https://preview.redd.it/lv2h7g9sdhq51.png?width=1000&format=png&auto=webp&s=56eada6e8c72dbb4406d7c5377ad15608bcc730e

Latency

Cosmos

Tendermint consensus, used in Cosmos, reaches finality within 6 seconds. Cosmos consists of many Zones and Hubs that connect to each other. Communication between two zones could pass through many hubs along the way, which can add to latency depending on the path taken, as explained in part two of the articles on Cosmos. It doesn't need to wait for an extended period of time with the risk of rollbacks.

Polkadot

Polkadot uses a hybrid consensus protocol consisting of a block production protocol, BABE, and a finality gadget called GRANDPA that works to agree on a chain, out of many possible forks, by following a simple fork-choice rule. Rather than voting on every block, it reaches agreement on chains. As soon as more than 2/3 of validators attest to a chain containing a certain block, all blocks leading up to that one are finalised at once.
If an invalid block is detected after it has been finalised, then the Relay Chain would need to be reverted along with every parachain. This is particularly important when connecting to external blockchains, as those don't share the state of the Relay Chain and thus can't be rolled back. The longer the time period, the more secure the network is, as there is more time for additional checks to be performed and reported, but this comes at the expense of finality. Finality is reached within 60 seconds between parachains, but for external ecosystems like Ethereum, whose state obviously can't be rolled back like a parachain's, finality will need to be much longer (60 minutes was suggested in the whitepaper), as discussed in more detail in part three.
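As a rough illustration of the "agree on chains, not blocks" idea, the sketch below (a simplification, not Polkadot's actual implementation; the vote and ancestor structures are assumed) finalises every block that more than 2/3 of validators implicitly attest to:

from collections import Counter

# Toy GRANDPA-style finality rule: a vote for a chain head counts for every ancestor
# of that head, and anything with more than 2/3 support is finalised in one go.
def finalize(votes, total_validators, ancestors_of):
    # votes: {validator_id: head block hash of the chain it attests to}
    # ancestors_of(block): ancestors of that block back to the last finalised block
    threshold = 2 * total_validators / 3
    support = Counter()
    for head in votes.values():
        for block in [head] + ancestors_of(head):
            support[block] += 1
    return [block for block, n in support.items() if n > threshold]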

Avalanche

Avalanche consensus achieves finality within 3 seconds, with most transactions finalising in under 1 second, immutable and completely irreversible. Any subnet can connect directly to another without having to go through multiple hops, and any VM can talk to another VM within the same subnet as well as in external subnets. It doesn't need to wait for an extended period of time with the risk of rollbacks.

Results

With regards to performance, far too much emphasis is put on tps as a metric; the other equally important metric, if not more important for finance, is latency. Throughput measures the amount of data that can be handled in a given time, whereas latency is the amount of time it takes for an action to complete. It's pointless saying you can process more transactions per second than VISA when it takes 60 seconds for a transaction to complete. Low latency also greatly increases general usability and customer satisfaction; nowadays everyone expects card payments and online payments to happen instantly. Avalanche achieves the best results, scoring ✅✅✅; Cosmos comes in second with 6-second finality ✅✅; and Polkadot, with 60-second finality (which may be 60 minutes for external blockchains), scores ✅
https://preview.redd.it/qe8e5ltudhq51.png?width=1000&format=png&auto=webp&s=18a2866104590f81a818690337f9121161dda890

Shared Security

Cosmos

Every Zone and Hub in Cosmos has its own validator set and different trust assumptions. Cosmos is researching a shared security model where a Hub can validate the state of connected zones for a fee, but this has not been released yet. Once available, this will make shared security optional rather than mandatory.

Polkadot

Shared security is mandatory with Polkadot, which uses a shared state infrastructure between the Relay Chain and all of the connected parachains. If the Relay Chain must revert for any reason, then all of the parachains would also revert. Every parachain makes the same trust assumptions, and as such the Relay Chain validates their state transitions and enables seamless interoperability between them. In return for this benefit, parachains have to purchase DOT and win an auction for one of the available parachain slots.
However, parachains can't rely solely on the Relay Chain for their security. As discussed in part three, they also need to implement censorship-resistance measures and sybil-resistance mechanisms such as proof of work or proof of stake on the parachain itself.

Avalanche

A subnet in Avalanche consists of a dynamic set of validators working together to achieve consensus on the state of a set of many blockchains, where complex rulesets can be configured to meet regulatory compliance. So, unlike in Cosmos where each zone / hub has its own validators, a subnet can validate a single virtual machine / blockchain or many of them with a single validator set. Shared security is optional.

Results

Shared security is mandatory in Polkadot and a key design decision in its infrastructure. The Relay Chain validates the state transitions of all connected parachains and thus it scores ✅✅✅. Subnets in Avalanche can validate the state of either a single or many virtual machines. Each subnet can have its own token and shares a validator set, where complex rulesets can be configured to meet regulatory compliance. It scores ✅✅. Every Zone and Hub in Cosmos has its own validator set / token, but research is underway to have the hub validate the state transitions of connected zones; as this is still early in the research phase, it scores ✅ for now.
https://preview.redd.it/0mnvpnzwdhq51.png?width=1000&format=png&auto=webp&s=8927ff2821415817265be75c59261f83851a2791

Current Adoption

Cosmos

The Cosmos project started in 2016, with an ICO held in April 2017. There are currently around 50 projects building on the Cosmos SDK; a full list can be seen here by filtering for Cosmos SDK. Not all of the projects will necessarily connect using the native Cosmos SDK and IBC, and some, such as Binance Chain, have forked parts of the Cosmos SDK and utilise Tendermint consensus but have said they will connect in the future.

Polkadot

The Polkadot project started in 2016, with an ICO held in October 2017. There are currently around 70 projects building on Substrate; a full list can be seen here by filtering for Substrate Based. As with Cosmos, not all projects built using Substrate will necessarily connect to Polkadot, and parachains and parathreads aren't currently implemented in either the live network or the test network (Kusama) as of the time of this writing.

Avalanche

Avalanche, in comparison, started much later, with Ava Labs being founded in 2018. Avalanche held its ICO in July 2020. Due to the much shorter time it has been in development, the number of confirmed projects is smaller, with around 14 projects currently building on Avalanche. Due to the customisability of the platform, though, many virtual machines can be used within a subnet, making it incredibly easy to port projects over. As an example, it will launch with the Ethereum Virtual Machine, which enables byte-for-byte compatibility, and all the tooling like Metamask, Truffle etc. will work, so projects can easily move over to benefit from the performance, decentralisation and low gas fees offered. In the future, Cosmos and Substrate virtual machines could be implemented on Avalanche.

Results

Whilst it's still early for all three projects (and the entire blockchain space as a whole), there are currently more projects confirmed to be building on Cosmos and Polkadot, mostly due to their longer time in development. Whilst Cosmos has fewer projects than Polkadot, zones are already implemented, whereas Polkadot doesn't currently have parachains. IBC, to connect zones and hubs together, is due to launch in Q2 2021; thus both score ✅✅✅. Avalanche has been in development for a much shorter time, but is launching with an impressive feature set right from the start: the ability to create subnets, VMs, assets, NFTs, permissioned and permissionless blockchains, cross-chain atomic swaps within a subnet, smart contracts, a bridge to Ethereum, etc. Applications can easily port over from other platforms and use all the existing tooling such as Metamask / Truffle etc., while benefiting from the performance, decentralisation and low gas fees offered. Currently, though, based purely on the number of projects, it scores ✅.
https://preview.redd.it/rsctxi6zdhq51.png?width=1000&format=png&auto=webp&s=ff762dea3cfc2aaaa3c8fc7b1070d5be6759aac2

Enterprise Adoption

Cosmos

Cosmos enables permissioned and permissionless zones which can connect to each other, with the ability to have full control over who validates the blockchain. For permissionless zones, each zone / hub can have its own token and is in control of who validates it.

Polkadot

With Polkadot, the state transition is performed by a small, randomly selected group of validators assigned from the Relay Chain, with the possibility that state is rolled back if an invalid transaction is found on any of the other parachains. This may pose a problem for enterprises that need complete control over who performs validation for regulatory reasons. In addition, due to the limited number of parachain slots available, enterprises would have to acquire and lock up large amounts of a highly volatile asset (DOT), with the possibility that they are outbid in future auctions and find they can no longer have their parachain validated, while parathreads don't provide the guaranteed performance requirements for their applications to function.

Avalanche

Avalanche enables permissioned and permissionless subnets, and complex rulesets can be configured to meet regulatory compliance. For example, a subnet can be created where it's mandatory that all validators are from a certain legal jurisdiction, or that they hold a specific licence and are regulated by the SEC, etc. Subnets are also able to scale to tens of thousands of validators, and even potentially millions of nodes, all participating in consensus, so every enterprise can run their own node rather than only a small number doing so. Enterprises don't have to hold large amounts of a highly volatile asset; instead they pay a fee in AVAX for the creation of subnets and blockchains, which is burnt.

Results

Avalanche provides the customisability to run private permissioned blockchains as well as permissionless ones, where the enterprise is in control of who validates the blockchain, with the ability to use complex rulesets to meet regulatory compliance; thus it scores ✅✅✅. Cosmos is also able to run permissioned and permissionless zones / hubs, so enterprises have full control over who validates a blockchain, and it scores ✅✅. Polkadot requires locking up large amounts of a highly volatile asset, with the possibility of being outbid by competitors and being unable to run the application if guaranteed performance is required, forcing a migration away. The Relay Chain validates the state transitions and can roll back a parachain should an invalid block be detected on another parachain; thus it scores ✅.
https://preview.redd.it/7phaylb1ehq51.png?width=1000&format=png&auto=webp&s=d86d2ec49de456403edbaf27009ed0e25609fbff

Interoperability

Cosmos

Cosmos will connect Hubs and Zones together through its IBC protocol (due for release in Q2 2021). Connecting to blockchains outside of the Cosmos ecosystem would either require the connected blockchain to fork its code to implement IBC or, more likely, a custom "Peg Zone" will be created, built specifically to work with the particular blockchain it's trying to bridge to, such as Ethereum. Each Zone and Hub has different trust levels, and connectivity between two zones can have different trust assumptions depending on which path it takes (this is discussed more in this article). Finality time is low at 6 seconds, but depending on the number of hops this can increase significantly.

Polkadot

Polkadot's shared state means each parachain that connects shares the same trust assumptions, those of the Relay Chain validators, and that if one blockchain needs to be reverted, all of them will need to be reverted. Interoperability is enabled between parachains through the Cross-Chain Message Passing (XCMP) protocol, and it is also possible to connect to other systems through bridges, which are specifically designed parachains or parathreads custom made to interact with another ecosystem such as Ethereum or Bitcoin. Finality time between parachains is around 60 seconds, but longer will be needed (initial figures of 60 minutes in the whitepaper) for connecting to external blockchains, which limits the appeal of connecting two external ecosystems together through Polkadot. Polkadot is also limited in the number of parachain slots available, limiting the number of blockchains that can be bridged. Parathreads could be used for lower-performance bridges, but the speed of future blockchains is only going to increase.

Avalanche

A subnet can validate multiple virtual machines / blockchains, and all blockchains within a subnet share the same trust assumptions / validator set, enabling cross-chain interoperability. Interoperability is also possible with any other subnet, with the hope that Avalanche will consist of thousands of subnets. Each subnet may have a different trust level, but as the primary network consists of all validators, it can be used as a source of trust if required. As Avalanche supports many virtual machines, bridges to other ecosystems are created by running the connected virtual machine. There will be an Ethereum bridge using the EVM shortly after mainnet. Finality time is much faster at under 3 seconds (with most happening under 1 second), with no chance of rolling back, so it is more appealing when connecting to external blockchains.

Results

All three systems are able to provide interoperability within their ecosystems and transfer assets as well as data, and can use bridges to connect to external blockchains. Cosmos has different trust levels between its zones and hubs, which can create issues depending on the path taken and adds latency. Polkadot provides the same trust assumptions for all connected parachains, but has longer finality and a limited number of parachain slots available. Avalanche provides the same trust assumptions for all blockchains within a subnet, and different trust levels between subnets; however, because the primary network consists of all validators, it can be used as a source of trust. Avalanche also has a much faster finality time, with no limitation on the number of blockchains / subnets / bridges that can be created. Overall, all three excel at interoperability within their ecosystems and each scores ✅✅.
https://preview.redd.it/l775gue3ehq51.png?width=1000&format=png&auto=webp&s=b7c4b5802ceb1a9307bd2a8d65f393d1bcb0d7c6

Tokenomics

Cosmos

The ATOM token is the native token of the Cosmos Hub. It is commonly mistaken for the token used throughout the Cosmos ecosystem, whereas it is just used for one of many hubs in Cosmos, each of which has its own token. Currently ATOM has little utility, as IBC isn't released and the Hub has no connections to other zones / hubs. Once IBC is released, zones may prefer to connect to a different hub instead, in which case ATOM would not be used. ATOM isn't a fixed-capped-supply token: supply will continuously increase, with yearly inflation of around 10% depending on the percentage staked. The current market cap for ATOM as of the time of this writing is $1 billion, with a circulating supply of 203 million. Rewards can be earned through staking to offset the dilution caused by inflation. Delegators can also get slashed and lose a portion of their ATOM should their validator misbehave.
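As a quick back-of-envelope illustration (with assumed figures, not official chain parameters) of why staking offsets dilution:

# Assumed numbers for illustration only; not live chain parameters.
supply = 200_000_000      # tokens before issuance
inflation = 0.10          # ~10% new tokens per year
staked_ratio = 0.70       # share of supply that is staked

new_tokens = supply * inflation
staker_apr = new_tokens / (supply * staked_ratio)          # issuance flows to stakers
non_staker_dilution = inflation / (1 + inflation)           # loss of network share if you don't stake

print(f"staker reward rate  ~ {staker_apr:.1%}")            # ~14.3%, more than the 10% inflation
print(f"non-staker dilution ~ {non_staker_dilution:.1%}")   # ~9.1% loss of share per year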

Polkadot

Polkadot's native token is DOT, and it's used to secure the Relay Chain. Each parachain needs to acquire sufficient DOT to win an auction for an available parachain lease period of up to 24 months at a time. Parathreads have a fixed registration fee that would realistically be much lower than the cost of acquiring a parachain slot, and they compete with other parathreads in a per-block auction to have their transactions included in the next Relay Chain block. DOT isn't a fixed-capped-supply token: supply will continuously increase, with yearly inflation of around 10% depending on the percentage staked. The current market cap for DOT as of the time of this writing is $4.4 billion, with a circulating supply of 852 million. Delegators can also get slashed and lose their DOT (potentially 100% of their DOT for serious attacks) should their validator misbehave.

Avalanche

AVAX is the native token of the primary network in Avalanche. Every validator of any subnet also has to validate the primary network and stake a minimum of 2,000 AVAX. Because there is no limit on the number of validators, unlike other consensus methods, this can cater for tens of thousands, even potentially millions, of validators. As every validator validates the primary network, it can serve as a source of trust for interoperability between subnets as well as for connecting to other ecosystems, increasing the volume of AVAX transaction fees. There is no slashing in Avalanche, so there is no risk of losing your AVAX when selecting a validator; instead, the rewards earned for staking can be slashed should the validator misbehave. Because Avalanche doesn't have direct slashing, it is technically possible for someone to both stake AND deliver tokens for something like a flash loan, under the invariant that all staked tokens are returned, thus being able to make a profit with staked tokens outside of staking itself.
There will also be a separate subnet for Athereum, which is a 'spoon', or friendly fork, of Ethereum, benefiting from the Avalanche consensus protocol and the applications in the Ethereum ecosystem. Its native token, ATH, will be airdropped to ETH holders and potentially to AVAX holders as well. This can be done for other blockchains too.
Transaction fees on the primary network for all three of its blockchains, as well as subscription fees for creating a subnet and blockchain, are paid in AVAX and are burnt, creating deflationary pressure. AVAX has a fixed capped supply of 720 million tokens, creating scarcity rather than an unlimited supply which continuously increases at a compounded rate each year like others. Initially 360 million tokens will be minted at mainnet, with vesting periods of between 1 and 10 years and tokens gradually unlocking each quarter. The circulating supply is 24.5 million AVAX, with tokens gradually released each quarter. The current market cap of AVAX is around $100 million.

Results

Avalanche's AVAX, with its fixed capped supply, deflationary pressure, very strong utility, potential to receive airdrops and low market cap, scores ✅✅✅. Polkadot's DOT also has very strong utility with the need for auctions to acquire parachain slots, but it has no deflationary mechanisms, no fixed capped supply and is already valued at $4.4 billion, therefore it scores ✅✅. Cosmos's ATOM token is only for the Cosmos Hub, which will be one of many hubs in the ecosystem, and it has very little utility currently (this may improve once IBC is released, and if the Cosmos Hub actually becomes the hub that people want to connect to rather than something like Binance instead). There is no fixed capped supply and it is currently valued at $1 billion, so it scores ✅.
https://preview.redd.it/zb72eto5ehq51.png?width=1000&format=png&auto=webp&s=0ee102a2881d763296ad9ffba20667f531d2fd7a
All three are excellent projects and have similarities as well as many differences. Just to reiterate, this article is not intended to be an extensive in-depth list, but rather an overview based on some of the criteria that I feel are most important. For a more in-depth view I recommend reading the articles for each of the projects linked above and coming to your own conclusions; you may have different criteria which are important to you, and score them differently. There won't be one platform to rule them all, however, with some use cases better suited to one platform over another, and it's not a zero-sum game. Blockchain is going to completely revolutionise industries and the Internet itself. The more projects researching and delivering breakthrough technology the better, each learning from the others and pushing each other to reach that goal earlier. The current market is a tiny speck of what's in store in terms of value and adoption, and it's going to be exciting to watch it unfold.
https://preview.redd.it/fwi3clz7ehq51.png?width=1388&format=png&auto=webp&s=c91c1645a4c67defd5fc3aaec84f4a765e1c50b6
submitted by anticensor_bot to u/anticensor_bot [link] [comments]

Thanks to all who submitted questions for Shiv Malik in the GAINS AMA yesterday, it was great to see so much interest in Data Unions! You can read the full transcript here:

Thanks to all who submitted questions for Shiv Malik in the GAINS AMA yesterday, it was great to see so much interest in Data Unions! You can read the full transcript here:

Gains x Streamr AMA Recap

https://preview.redd.it/o74jlxia8im51.png?width=1236&format=png&auto=webp&s=93eb37a3c9ed31dc3bf31c91295c6ee32e1582be
Thanks to everyone in our community who attended the GAINS AMA yesterday with Shiv Malik. We were excited to see that so many people attended, and we were gladly overwhelmed by the number of questions we got from you on Twitter and Telegram. We decided to do a little recap of the session for anyone who missed it, and to archive some points we haven't previously discussed with our community. Happy reading, and thanks to Alexandre and Henry for having us on their channel!
What is the project about in a few simple sentences?
At Streamr we are building a real-time network for tomorrow’s data economy. It’s a decentralized, peer-to-peer network which we are hoping will one day replace centralized message brokers like Amazon’s AWS services. On top of that one of the things I’m most excited about are Data Unions. With Data Unions anyone can join the data economy and start monetizing the data they already produce. Streamr’s Data Union framework provides a really easy way for devs to start building their own data unions and can also be easily integrated into any existing apps.
Okay, sounds interesting. Do you have a concrete example you could give us to make it easier to understand?
The best example of a Data Union is the first one that has been built out of our stack. It's called Swash and it's a browser plugin.
You can download it here: http://swashapp.io/
And basically it helps you monetize the data you already generate (day in day out) as you browse the web. It's the sort of data that Google already knows about you. But this way, with Swash, you can actually monetize it yourself. The more people that join the union, the more powerful it becomes and the greater the rewards are for everyone as the data product sells to potential buyers.
Very interesting. What stage is the project/product at? It's live, right?
Yes. It's live. And the Data Union framework is in public beta. The Network is on course to be fully decentralized at some point next year.
How much can a regular person browsing the Internet expect to make for example?
So that's a great question. The answer is no one quite knows yet. We do know that this sort of data (consumer insights) is worth hundreds of millions and really isn't available in high quality. So with a union of a few million people, everyone could be getting 20-50 dollars a year. But it'll take a few years at least to realise that growth. Of course Swash is just one data union amongst many possible others (which are now starting to get built out on our platform!)
With Swash, I believe they now have 3,000 members. They need to get to 50,000 before they become really viable but they are yet to do any marketing. So all that is organic growth.
I assume the data is anonymized btw?
Yes. And there are in fact a few privacy-protecting tools Swash supplies to its users.
How does Swash compare to Brave?
So Brave really is about consent for people's attention and getting paid for that. They don't sell your data as such.
Swash can of course be a plugin with Brave and therefore you can make passive income browsing the internet. Whilst also then consenting to advertising if you so want to earn BAT.
Of course it's Streamr that is powering Swash. And we're looking at powering other DUs - say for example mobile applications.
The holy grail might be having already existing apps and platforms out there, integrating DU tech into their apps so people can consent (or not) to having their data sold - and then getting a cut of that revenue when it does sell.
The other thing to recognise is that the big tech companies monopolise data on a vast scale - data that we of course produce for them. That is stifling innovation.
Take for example a competitor map app. To effectively compete with Google maps or Waze, they need millions of users feeding real time data into it.
Without that - it's like Google maps used to be - static and a bit useless.
Right, so how do you convince these big tech companies that are producing these big apps to integrate with Streamr? Does it mean they wouldn't be able to monetize data as well on their end if it becomes more available through an aggregation of individuals?
If a map application does manage to scale to that level then inevitably Google buys them out - that's what happened with Waze.
But if you have a data union which bundles together the raw location data of millions of people then any application builder can come along and license that data for their app. This encourages all sorts of innovation and breaks the monopoly.
We're currently having conversations with Mobile Network operators to see if they want to pilot this new approach to data monetization. And that's what even more exciting. Just be explicit with users - do you want to sell your data? Okay, if yes, then which data point do you want to sell.
Then the mobile network operator (like T-mobile for example) then organises the sale of the data of those who consent and everyone gets a cut.
Streamr - in this example provides the backend to port and bundle the data, and also the token and payment rail for the payments.
So for big companies (mobile operators in this case), it's less logistics, handing over the implementation to you, and simply taking a cut?
It's a vision that we'll be able to talk more about more concretely in a few weeks time 😁
Compared to having to make sense of that data themselves (in the past) and selling it themselves
Sort of.
We provide the backend to port the data and the template smart contracts to distribute the payments.
They get to focus on finding buyers for the data and ensuring that the data that is being collected from the app is the kind of data that is valuable and useful to the world.
(Through our sister company TX, we also help build out the applications for them and ensure a smooth integration).
The other thing to add is that the reason why this vision is working is that the current data economy is under attack. Not just from privacy laws such as GDPR, but also from Google shutting down cookies, bidstream data being investigated by the FTC (for example) and Apple making changes to iOS 14 to make third-party data sharing more explicit for users.
All this means that the only real places for thousands of multinationals to buy the sort of consumer insights they need to ensure good business decisions will be owned by Google/FB etc, or from SDKs or through this method - from overt, rich, consent from the consumer in return for a cut of the earnings.
A couple of questions to get a better feel about Streamr as a whole now and where it came from. How many people are in the team? For how long have you been working on Streamr?
We are around 35 people with one office in Zug, Switzerland and another one in Helsinki. But there are team members all over the globe, we’ve people in the US, Spain, the UK, Germany, Poland, Australia and Singapore. I joined Streamr back in 2017 during the ICO craze (but not for that reason!)
And did you raise funds so far? If so, how did you handle them? Are you planning to do any future raises?
We did an ICO back in Sept/Oct 2017 in which we raised around 30 million CHF. The funds give us enough runway for around five/six years to finalise our roadmap. We've also simultaneously opened up a sister consultancy business, TX, which helps enterprise clients implement the Streamr stack. We've got no plans to raise more!
What is the token use case? How did you make sure it captures the value of the ecosystem you're building
The token is used for payments on the Marketplace (such as for Data Union products for example) also for the broker nodes in the Network. ( we haven't talked much about the P2P network but it's our project's secret sauce).
The broker nodes will be paid in DATAcoin for providing bandwidth. We are currently working together with Blockscience on our token economics. We've just started the second phase of their consultancy process and will soon be able to share more on the Streamr Network's token economics.
But if you want to sum up the Network in a sentence or two: imagine the BitTorrent network being run by nodes who get paid to do so, except that instead of passing around static files, it's real-time data streams.
That of course means it's really well suited for the IoT economy.
Well, let's continue with questions from Twitter and this one comes at the perfect time. Can Streamr Network be used to transfer data from IOT devices? Is the network bandwidth sufficient? How is it possible to monetize the received data from a huge number of IOT devices? From u/ EgorCypto
Yes, IoT devices are a perfect use case for the Network. When it comes to the network’s bandwidth and speed - the Streamr team just recently did extensive research to find out how well the network scales.
The result was that it is on par with centralized solutions. We ran experiments with network sizes between 32 to 2048 nodes and in the largest network of 2048 nodes, 99% of deliveries happened within 362 ms globally.
To put these results in context, PubNub, a centralized message brokering service, promises to deliver messages within 250 ms — and that’s a centralized service! So we're super happy with those results.
Here's a link to the paper:
https://medium.com/streamrblog/streamr-network-performance-and-scalability-whitepaper-adb461edd002
While we're on the technical side, second question from Twitter: Can you be sure that valuable data is safe and not shared with service providers? Are you using any encryption methods? From u/ CryptoMatvey
Yes, the messages in the Network are encrypted. Currently all nodes are still run by the Streamr team. This will change in the Brubeck release - our last milestone on the roadmap - when end-to-end encryption is added. This release adds end-to-end encryption and automatic key exchange mechanisms, ensuring that node operators can not access any confidential data.
If, by the way, you want to get very technical, the encryption algorithms we are using are: AES (AES-256-CTR) for encryption of data payloads, RSA (PKCS #1) for securely exchanging the AES keys and ECDSA (secp256k1) for data signing (the same as Bitcoin and Ethereum).
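For the curious, here is a minimal sketch of what the AES-256-CTR payload step can look like in Python using the cryptography library; it illustrates the cipher only (key exchange and signing omitted) and is not Streamr's actual code.

import os
from cryptography.hazmat.primitives.ciphers import Cipher, algorithms, modes

# Illustrative AES-256-CTR payload encryption; not Streamr's implementation.
def encrypt_payload(key: bytes, plaintext: bytes):
    assert len(key) == 32                  # AES-256 needs a 32-byte key
    nonce = os.urandom(16)                 # unique 128-bit initial counter block per message
    enc = Cipher(algorithms.AES(key), modes.CTR(nonce)).encryptor()
    return nonce, enc.update(plaintext) + enc.finalize()

def decrypt_payload(key: bytes, nonce: bytes, ciphertext: bytes):
    dec = Cipher(algorithms.AES(key), modes.CTR(nonce)).decryptor()
    return dec.update(ciphertext) + dec.finalize()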
Last question from Twitter, less technical now :) In their AMA ad, they say that Streamr has three unions, Swash, Tracey and MyDiem. Why does Tracey help fisherfolk in the Philippines monetize their catch data? Do they only work with this country or do they plan to expand? From u/ alej_pacedo
So yes, Tracey is one of the first Data Unions on top of the Streamr stack. Currently we are working together with the WWF-Philippines and the UnionBank of the Philippines on doing a first pilot with local fishing communities in the Philippines.
WWF is interested in the catch data to protect wildlife and make sure that no overfishing happens. And at the same time the fisherfolk are incentivized to record their catch data by being able to access micro loans from banks, which in turn helps them make their business more profitable.
So far, we have lots of interest from other places in South East Asia which would like to use Tracey, too. In fact TX have already had explicit interest in building out the use cases in other countries and not just for sea-food tracking, but also for many other agricultural products.
(I think they had a call this week about a use case involving cows 😂)
I recall late last year, that the Streamr Data Union framework was launched into private beta, now public beta was recently released. What are the differences? Any added new features? By u/ Idee02
The main difference will be that the DU 2.0 release will be more reliable and also more transparent, since the sidechain we are using for micropayments is now also based on blockchain consensus (PoA).
Are there plans in the pipeline for Streamr to focus on the consumer-facing products themselves or will the emphasis be on the further development of the underlying engine?by u/ Andromedamin
We're all about what's under the hood. We want third party devs to take on the challenge of building the consumer facing apps. We know it would be foolish to try and do it all!
As a project how do you consider the progress of the project to fully developed (in % of progress plz) by u/ Hash2T
We're about 60% through I reckon!
What tools does Streamr offer developers so that they can create their own DApps and monetize data?What is Streamr Architecture? How do the Ethereum blockchain and the Streamr network and Streamr Core applications interact? By u/ CryptoDurden
We'll be releasing the Data Union framework in a few weeks from now, and I think DApp builders will be impressed with what they find.
We all know that Blockchain has many disadvantages as well,
So why did Streamr choose blockchain as a combination for its technology?
What's your plan to merge Blockchain with your technologies to make it safer and more convenient for your users? By u/ noonecanstopme
So we're not a blockchain ourselves; that's important to note. The P2P network only uses blockchain tech for the payments. Why on earth, for example, would you want to store every single piece of info on a blockchain? You should only store what you want to store, and that should probably happen off chain.
So we think we got the mix right there.
What were the requirements needed for node setup ? by u/ John097
Good q - we're still working on that but those specs will be out in the next release.
How does the STREAMR team ensure good data is entered into the blockchain by participants? By u/ kartika84
Another great Q there! From the product buying end, this will be done by reputation. But ensuring the quality of the data as it passes through the network - if that is what you also mean - is all about getting the architecture right. In a decentralised network, that's not easy as data points in streams have to arrive in the right order. It's one of the biggest challenges but we think we're solving it in a really decentralised way.
What are the requirements for integrating applications with Data Union? What role does the DATA token play in this case? By u/ JP_Morgan_Chase
There are no specific requirements as such, just that your application needs to generate some kind of real-time data. Data Union members and administrators are both paid in DATA by data buyers coming from the Streamr marketplace.
Regarding security and legality, how does STREAMR guarantee that the data uploaded by a given user belongs to him and he can monetize and capitalize on it? By u/ kherrera22
So that's a sort of million dollar question for anyone involved in a digital industry. Within our system there are ways of ensuring that but in the end the negotiation of data licensing will still, in many ways be done human to human and via legal licenses rather than smart contracts. at least when it comes to sizeable data products. There are more answers to this but it's a long one!
Okay thank you all for all of those!
The AMA took place in the GAINS Telegram group 10/09/20. Answers by Shiv Malik.
submitted by thamilton5 to streamr [link] [comments]

6 Reasons Why Serum Won't Succeed

6 Reasons Why Serum Won't Succeed

The world of DeFi is exploding but is it all it’s made out to be?

DeFi (decentralised finance) is most certainly the buzz in the crypto world this minute. It's bringing back feelings similar to the 2017/18 ICO phase, when a mammoth number of new projects began to explode onto the scene, each with its own promise of new innovation and use case.
Hindsight has shown us that most of those projects have ultimately failed, or worse, were outright scams that took advantage of not-so-wise investors looking to make a buck. Obviously, not all projects fit that description, with many teams still around today working on and delivering their individual visions. Crypto is, after all, still a big experiment in new technology.

Enter DeFi: Serum

DeFi has exploded into the limelight over the last few months, with some tokens appreciating hundreds of percent in price. It appears to be the catalyst that has driven a huge market shift in the crypto world, and for those who’ve been around a number of years, this is a welcome change.
In this piece, I’m going to examine a particular project called Serum.
Serum is the world’s first completely decentralized derivatives exchange with trustless cross-chain trading brought to you by Project Serum.
The Serum Project is aiming to create both a decentralised exchange and a cross-chain swapping mechanism. In this article, I’m going to focus solely on the cross-chain swapping aspect of Serum.
Although the Serum whitepaper is quite short and lacking in detail, it is useful to derive some understanding of how the cross-chain swapping protocol should work. Throughout this review, I will use it to describe how the imagined protocol works.

Overview

Let's assume Alice wants to trade some BTC for ETH and Bob wants to trade some ETH for BTC using Serum. These two users are matched and agree on a price using an on-chain order book on the Solana blockchain (the whitepaper provides no practical details on how this is done).
Once these users are matched, Bob must send the ETH he wants to trade to an Ethereum smart contract, plus some amount of ETH (roughly 200 USD worth; see section 4 below) as collateral. Alice will also need to send some collateral to the smart contract. Once this initial setup process is complete, Alice then has to send her BTC to Bob's BTC address, and if Bob receives the BTC from Alice he can then release his ETH from the smart contract, sending it to Alice's ETH address. Upon completion of this, both Alice and Bob are refunded their ETH collateral.
So what happens if something goes wrong? For example, say Alice never sends BTC to Bob. After some period of time, Bob can initiate a dispute. When the dispute begins, both Alice and Bob present a portion of the Bitcoin blockchain information to the smart contract (see section 3). The smart contract then decides whether or not Alice did send BTC to Bob. If she hasn't, then the smart contract returns Bob's ETH and collateral to Bob and also takes Alice's ETH collateral and gives it to Bob. The same occurs in reverse if Alice sends BTC but Bob never approves the transfer of ETH from the smart contract.
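To make the flow concrete, here is a toy sketch of the settlement logic just described. It is a simplified model under the assumptions above, not Serum's actual contract, and the names and amounts are illustrative.

# Toy model of the swap settlement described above; not Serum's contract code.
class Swap:
    def __init__(self, bob_eth, alice_collateral, bob_collateral):
        self.bob_eth = bob_eth                      # ETH Bob locked for the trade
        self.alice_collateral = alice_collateral    # ETH Alice locked as a good-behaviour bond
        self.bob_collateral = bob_collateral

    def settle_happy_path(self):
        # Bob confirms he received Alice's BTC: ETH goes to Alice, bonds are refunded.
        return {"alice": self.bob_eth + self.alice_collateral,
                "bob": self.bob_collateral}

    def settle_dispute(self, alice_proved_btc_payment: bool):
        # The contract inspects the submitted BTC chain data and pays out accordingly.
        if alice_proved_btc_payment:    # Bob withheld approval: he loses his bond
            return {"alice": self.bob_eth + self.alice_collateral + self.bob_collateral,
                    "bob": 0}
        else:                           # Alice never paid: she loses her bond
            return {"alice": 0,
                    "bob": self.bob_eth + self.bob_collateral + self.alice_collateral}

For example, Swap(bob_eth=1.0, alice_collateral=0.5, bob_collateral=0.5).settle_happy_path() pays Alice 1.5 ETH (the trade plus her refunded bond) and returns Bob his 0.5 ETH bond.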
This scheme seems pretty simple: there are no oracles and no centralised parties. However, it has a number of disadvantages.

1. User-Provided Collateral Is Bad for User Experience

Each time a user conducts a swap they must reserve some percentage or fixed amount to cover the collateral for the swap. This collateral needs to be present to prevent griefing attacks, where users initiate swaps with no intention of ever following through and sending funds to the other participant.
However, this creates a poor user experience, as both Alice and Bob need to have at least the value of the dispute fee committed to the contract in collateral before they conduct a swap. This is totally foreign to the normal exchange experience, in which you only require a single coin and a single transaction to begin trading. For example, if using Serum to trade Bitcoin you would need to hold Bitcoin and roughly $200 of Ethereum, and also interact with the Ethereum chain before any swap occurs. This adds unnecessary complexity and confusion, especially for newcomers to the crypto space.

2. ETH Must Always Be on One Side of the Swap

Although the Serum method of cross-chain swapping could occur on any blockchain with smart contracts, the Serum whitepaper makes it clear the Serum arbitration contract is going to be deployed on the Ethereum blockchain. This means one party must always be locking the full value of the trade in ETH using an Ethereum smart contract.
This makes it impossible, for example, to do a single-step trade between Bitcoin and Monero, since the swap would need to be from Bitcoin to ETH first and then from ETH to Monero. This is comparable to other proposed cross-chain swap systems like Thorchain and Blockswap; however, since those networks use AMMs (automated market makers) and decentralised vaults to take custody of funds, the user need not interact with the intermediary chain at all.
Instead in Serum, the user wanting to swap Bitcoin to Monero will need to do the following steps:
  1. Send Ethereum collateral to the Serum arbitration contract
  2. Send Bitcoin to the user they are swapping with.
  3. Receive Ethereum
  4. Send Ethereum back to Serum arbitration contract
  5. Receive Monero
  6. Send Ethereum out of Serum arbitration contract
  7. Receive back Ethereum collateral
It might be possible to remove or simplify step 4, depending on how the smart contract is built; however, this means a swap from BTC to Monero would require two Ethereum transactions and one Bitcoin transaction in the best-case scenario. Compared with the experience of other cross-chain swapping mechanisms, which only require the user to send a single transaction to swap between two assets, this is a very poor user experience.

3. Proving Transactions on Arbitrary Chains to a Smart Contract Is Not Trivial

Perhaps the most central part of the Serum cross-chain swapping mechanism is left almost completely unexplored in the Serum whitepaper, with only a brief explanation given.
“[The] Smart Contract is programmed to parse whether a proposed BTC blockchain is valid; it can then check which of Alice and Bob send the longer valid blockchain, and settle in their favor”
This is not a trivial problem, and it is unclear how this actually works from the explanation given in the Serum whitepaper. What actually needs to be presented to the smart contract to prove a Bitcoin transaction? Typically, when talking about SPV, the smart contract would need the block headers of all previous blocks and a merkle inclusion proof. This is far too heavy to submit in a dispute. Instead, Serum could use NIPoPoWs; however, these proofs only work on chains with fixed difficulty and are still probably prohibitively large (~100KB) to be submitted as a proof to a contract. Other solutions like FlyClient are more versatile, but proof sizes are much larger and have failed to see much real-world adoption.
Without explaining how they actually plan to do this validation of Bitcoin transactions, users are left in the dark about how secure their solution actually is.
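For context on what a merkle inclusion proof involves, here is a minimal sketch of the verification step for a Bitcoin-style tree (double SHA-256). Note that it assumes the verifier already trusts merkle_root, which is exactly the part (trusting the right chain of block headers) that the whitepaper leaves unexplained.

import hashlib

def double_sha256(data: bytes) -> bytes:
    return hashlib.sha256(hashlib.sha256(data).digest()).digest()

# Minimal merkle inclusion check: proof is a list of (sibling_hash, sibling_is_right)
# pairs from the leaf up to the root. Trusting merkle_root still requires block headers.
def verify_inclusion(txid: bytes, merkle_root: bytes, proof) -> bool:
    node = txid
    for sibling, sibling_is_right in proof:
        node = double_sha256(node + sibling if sibling_is_right else sibling + node)
    return node == merkle_root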

4. High Dispute Fees Force Large Collateral on Small Trades

Although disputes should almost never happen because of the incentives and punishments designed into the Serum protocol, the way they are designed has negative impacts on the use of the network.
Although the Serum whitepaper does not say how the dispute mechanism works, it does say that it will cost about ~100 USD in gas to dispute a swap.
Note: keep in mind that the Serum paper was published in July 2020, when the gas price was about 50 Gwei. As Ethereum use has picked up over the past month, we have seen average gas prices as high as 250 Gwei, with the average price right now about 120 Gwei.
This means that at the height of gas prices it could have cost a user ~500 USD to dispute a swap.
This means that for the network to ensure losing cross-chain swaps aren't made, each user must deploy at least $200 in collateral on each side. It may be possible to lower this collateral if we assume the attacker is not financially motivated; however, there is a lower bound below which ransom attacks become possible on low-value trades.
Further, and perhaps more damagingly, this means that in a trade of any size the user needs to have at least 300 USD in ETH lying around: 100 USD in ETH for the required collateral and 200 USD in case they need to challenge the transaction.
This further adds to the poor user experience when using Serum for cross-chain swapping.
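As a rough sanity check of how that dispute cost scales with gas prices, the snippet below back-solves the figures. The gas amount and ETH price are assumptions chosen to reproduce the whitepaper's ~$100-at-50-Gwei claim, not published numbers.

# Assumed figures chosen to match the ~$100 at 50 Gwei claim; not from the whitepaper.
gas_per_dispute = 8_000_000        # heavy on-chain verification, close to a full block
eth_price_usd = 250                # rough ETH price around July 2020

def dispute_cost_usd(gas_price_gwei: float) -> float:
    eth_spent = gas_per_dispute * gas_price_gwei * 1e-9
    return eth_spent * eth_price_usd

for gwei in (50, 120, 250):
    print(f"{gwei:>3} Gwei -> ~${dispute_cost_usd(gwei):,.0f}")
# 50 Gwei -> ~$100, 120 Gwei -> ~$240, 250 Gwei -> ~$500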

5. Swaps Are Not Set and Forget

Instead of being able to send a transaction and receive funds on the blockchain you are swapping to, the process is highly interactive. In the case where I am swapping ETH for Bitcoin, the following occurs:
If the Bitcoin transaction is never received then I need to wait for a timeout to occur before I can participate in the dispute process.
And on the Bitcoin side (assuming the seller is ready), the following must take place:
If the Seller never accepts the Bitcoin I sent to him then I need to wait on line for the dispute process.
This presents a strange user experience where the seller, or the seller's wallet, must be left online during this whole process and be ready to sign a new transaction if they need to dispute transactions or unlock funds from the smart contract.
This is different from the typical exchange or swapping scenario in which, once your funds are sent you can be assured you will receive the amount you expected in your swap back to you, without any of your wallets needing to remain online.

6. The Serum Token Seems to Lack a Use Case

The cross-chain swapping protocol Serum describes in its whitepaper could easily be forked and launched on the Ethereum blockchain without any need for the Serum token. It seems that the Serum token will be used in some capacity when placing orders on the Solana-based blockchain; however, the order book could just as easily be run with traditional rate-limiting schemes.
There is some brief mention of future governance abilities for token holders, however, as a common theme in their whitepaper, details are scarce:
Serum is anticipated to include a limited governance model based on the SRM token. While most of the Serum ecosystem will be immutable, some parameters without large security risks (e.g. future fees) may be modified via a governance vote of SRM tokens.

Conclusion

Until satisfactory answers are given to these questions, I would be looking at other projects that are attempting to build platforms for cross-chain swaps. As previously mentioned, Thorchain and Blockswap show some promise in design, whilst there are some others competing in this space too, such as Incognito and RenVM. However, this area is still extremely immature, so plenty of testing and time is required before we can call any of these projects a success.
If you’ve got any feedback or thoughts about Serum, cross-chain swapping or DeFi in general, please don’t be shy in leaving a comment.
submitted by Loooong_Loooong_Man to CryptoCurrency [link] [comments]

Building an Ethereum Mining Rig - Part 2

First update to the guide "Building a 6 GPU Mining Rig for Ethereum" - let's talk about Claymore.
This update supplements, and does not replace, the Guide to Building a 6 GPU Mining Rig for Ethereum published on our site.

The substantial differences are due to the installation of the latest version of the Windows 10 operating system, mining on the Ethermine pool (in our opinion simpler than Dwarfpool) and the use of XFX RX 580 8 GB GPUs.

The first variant is found in Part 4 of the guide: the environment variables do not need to be entered, as they will be integrated directly into the .bat file used to start mining.

The second variant is found in Part 7 of the guide and leads us to mine on a different pool using the Claymore software.
Download the latest version at the following link: https://github.com/Claymore-Dual/Claymore-Dual-Miner

Once downloaded, unzip everything into a folder on your desktop and open the start.bat file with Notepad. Clear the contents and copy in the following command:

EthDcrMiner64.exe -epool eu1.ethermine.org:14444 -ewal "your ETH wallet address" -epsw x -worker "worker"

Where instead of "your ETH wallet address" you will have to put your Ethereum wallet address (obviously without the quotes), and instead of "worker" you will put an identification name in case you build more rigs (such as RIG1, RIG2, etc.). We opted for the eu1 pool, even if some on the discussion forums believe that us1 is more profitable.

At the following link, many other useful commands for your Rig:
https://github.com/Claymore-Dual/Claymore-Dual-Miner


The Ethermine pool offers a very well-crafted and descriptive interface. In the Payouts section, after only 5 minutes of mining, you can set the minimum amount of Ether to be transferred to your wallet by simply entering the IP address of the rig.

We have decided to mine directly to the Ethereum address of our Exodus wallet. Mining directly to Coinbase is not recommended, as reported on the Coinbase site itself. A pity.

Nothing should be left to chance when you decide to build a mining rig for Ethereum.
The third variant is the most difficult of all. Once you have reached Part 5 of the guide, you can decide whether to continue or to follow this update / variant. If you are here, it is probably because you have run into some problem that the guide does not allow you to solve.

With the latest version of Windows 10, you may run into a kernel conflict between the operating system and AMD's Radeon Software Crimson ReLive Edition Beta for Blockchain Compute drivers. This conflict will prevent you from using Atiflash after installing the drivers.

Important: before making any changes to the BIOS, please back up each GPU.

Important: first of all, flash the GPUs with their original BIOS if for any reason you are forced to reinstall the operating system.


Still on Atiflash.
The advice would therefore be to flash the GPUs and then install the AMD drivers. We say "would be" because you may run into another problem, this time related to the GPUs themselves. Since each video card is different from any other, the BIOS mod of a GPU could crash the operating system, showing the classic blue screen and displaying an error related to the atikmdag.sys file.

This could be due to the fact that some GPUs have significantly higher performance in the calculation phase than others. We could call it a factory overclock, but since we do not use them for gaming we cannot say so with absolute certainty.

Having assessed these two drawbacks, the only safe solution is to flash all the GPUs, disconnect all of them except the first one, install the Blockchain drivers (plus the atikmdag patch) and launch the mining command, verifying that the operating system does not crash in the next 5 minutes.

Turn off the rig again and connect the second GPU, and so on up to the sixth. In the event that one or more video cards crash the system, disconnect them. After that, use DDU from safe mode and flash those GPUs with their original BIOS. At this point, connect them again, reinstall the Blockchain drivers (plus the atikmdag patch) and start mining definitively.

All the operations related to the use of Atiflash, DDU and driver installation are reported in Part 5 and Part 6 of our guide.
A little bit of Overclocking.
You will certainly find significant differences in performance between the GPUs.

At this point, all that remains is to work with an overclocking tool. We opted for OverdriveNtool. Our constantly updated guide is available at the following link: https://www.cryptoall.it/2019/10/12/complete-guide-to-overdriventool/

Link to the official YouTube channel for verification: https://www.youtube.com/channel/UCdE9TTHAOtyKxy59rALSprA

GPUs with a modified BIOS will not leave much room for further adjustment. You will have to proceed with the most aggressive overclocking on those that run the original BIOS, obviously always in small steps, saving the profile for each GPU. Our guide explains in detail how to do it.

Hoping to have been of help, we invite everyone to the second part of the update on how to build an Ethereum mining rig, in which we will explain in detail dual mining on the Ethermine pool.

See you soon.


If you liked this article and would like to contribute with a donation:

Bitcoin: 1Ld9b165ZYHZcY9eUQmL9UjwzcphRE5S8Z
Ethereum: 0x8D7E456A11f4D9bB9e6683A5ac52e7DB79DBbEE7
Litecoin: LamSRc1jmwgx5xwDgzZNoXYd6ENczUZViK
Stellar: GBLDIRIQWRZCN5IXPIKYFQOE46OG2SI7AFVWFSLAHK52MVYDGVJ6IXGI
Ripple: rUb8v4wbGWYrtXzUpj7TxCFfUWgfvym9xf

By: cryptoall.it Telegram Channel: t.me/giulo75 Netbox Browser: https://netbox.global/PZn5A
submitted by Giulo75 to u/Giulo75 [link] [comments]

Polkadot Launch AMA Recap

Polkadot Launch AMA Recap

The Polkadot Telegram AMA below took place on June 10, 2020

https://preview.redd.it/4ti681okap951.png?width=4920&format=png&auto=webp&s=e21f6a9a276d35bb9cdec59f46744f23c37966ef
AMA featured:
Dieter Fishbein, Ecosystem Development Lead, Web3 Foundation
Logan Saether, Technical Education, Web3 Foundation
Will Pankiewicz, Master of Validators, Parity Technologies
Moderated by Dan Reecer, Community and Growth, Polkadot & Kusama at Web3 Foundation

Transcription compiled by Theresa Boettger, Polkadot Ambassador:

Dieter Fishbein, Ecosystem Development Lead, Web3 Foundation

Dan: Hey everyone, thanks for joining us for the Polkadot Launch AMA. We have Dieter Fishbein (Head of Ecosystem Development, our business development team), Logan Saether (Technical Education), and Will Pankiewicz (Master of Validators) joining us today.
We had some great questions submitted in advance, and we’ll start by answering those and learning a bit about each of our guests. After we go through the pre-submitted questions, then we’ll open up the chat to live Q&A and the hosts will answer as many questions as they can.
We’ll start off with Dieter and ask him a set of some business-related questions.

Dieter could you introduce yourself, your background, and your role within the Polkadot ecosystem?

Dieter: I got my start in the space as a cryptography researcher at the University of Waterloo. This is where I first learned about Bitcoin and started following the space. I spent the next four years or so on the investment team for a large asset manager where I primarily focused on emerging markets. In 2017 I decided to take the plunge and join the space full-time. I worked at a small blockchain-focused VC fund and then joined the Polkadot team just over a year ago. My role at Polkadot is mainly focused on ensuring there is a vibrant community of projects building on our technology.

Q: Adoption of Polkadot of the important factors that all projects need to focus on to become more attractive to the industry. So, what is Polkadot's plan to gain more Adoption? [sic]

A (Dieter): Polkadot is fundamentally a developer-focused product, so much of our adoption strategy is focused around making Polkadot an attractive product for developers. This has many elements. Right now the path for most developers to build on Polkadot is by creating a blockchain using the Substrate framework, which they will later connect to Polkadot when parachains are enabled. This means that much of our adoption strategy comes down to making Substrate an attractive tool and framework. However, it's not enough just to make building on Substrate attractive; we must also provide an incentive for these developers to actually connect their Substrate-based chain to Polkadot. Part of this incentive is the security that the Polkadot relay chain provides, but another key incentive is becoming interoperable with a rich ecosystem of other projects that connect to Polkadot. This means that a key part of our adoption strategy is outreach focused. We go out there and try to convince the best projects in the space that building on our technology will provide them with significant value-add. This is not a purely technical argument. We provide significant support to projects building in our ecosystem through grants, technical support, incubator / accelerator programs and other structured support programs such as the Substrate Builders Program (https://www.substrate.io/builders-program). I do think we really stand out in the significant, continued support that we provide to builders in our ecosystem. You can also take a look at the over 100 grants that we've given from the Web3 Foundation: https://medium.com/web3foundation/web3-foundation-grants-program-reaches-100-projects-milestone-8fd2a775fd6b

Q: On moving forward through your roadmap, what are your most important next priorities? Does the Polkadot team have enough fundamentals (Funds, Community, etc.) to achieve those milestones?

A (Dieter): I would say the top priority by far is to ensure a smooth roll-out of key Polkadot features such as parachains, XCMP and other key parts of the protocol. Our recent Proof of Authority network launch was only just the beginning; it’s crucial that we carefully and successfully deploy features that allow builders to build meaningful technology. Second to that, we want to promote adoption by making more teams aware of Polkadot and how they can leverage it to build their product. Part of this comes down to the outreach that I discussed before, but a major part of it is much more community-driven, and many members of the team focus on this.
We are also blessed to have an awesome community to make this process easier 🙂

Q: Where can a list of Polkadot's application-specific chains be found?

A (Dieter): The best list right now is http://www.polkaproject.com/. This is a community-led effort and the team behind it has done a terrific job. We’re also working on providing our own resource for this and we’ll share that with the community when it’s ready.

Q: Could you explain the differences and similarities between Kusama and Polkadot?

A (Dieter): Kusama is fundamentally a less robust, faster-moving version of Polkadot with less economic backing by validators. It is less robust since we will be deploying new technology to Kusama before Polkadot, so it may break more frequently. It has less economic backing than Polkadot, so a network takeover is easier on Kusama than on Polkadot, lending itself more to use cases that do not need bank-like security.
In exchange for lower security and robustness, we expect the cost of a parachain lease to be lower on Kusama than on Polkadot. Polkadot will always be 100% focused on security and robustness, and I expect that applications dealing with high-value transactions, such as those in the DeFi space, will always want a Polkadot deployment. I think there will be a market for applications willing to accept lower security and robustness in exchange for cheap, high throughput, such as those in the gaming, content distribution or social networking sectors. Check out https://polkadot.network/kusama-polkadot-comparing-the-cousins/ for more detailed info!

Q: and for what reasons would a developer choose one over the other?

A (Dieter): Firstly, I see some earlier-stage teams who are still iterating on their technology choosing to deploy to Kusama exclusively because of its lower-stakes, faster-moving environment, where it will be easier for them to iterate on their technology and build their user base. These will likely encompass the sectors I identified above. For these teams, Polkadot becomes an eventual upgrade path if, and when, they are able to perfect their product, build a larger community of users and start to need the increased stability and security that Polkadot will provide.
Secondly, I suspect many teams who have their main deployment on Polkadot will also have an additional deployment on Kusama to allow them to test new features, either their tech or changes to the network, before these are deployed to Polkadot mainnet.

Logan Saether, Technical Education, Web3 Foundation

Q: Sweet, let's move over to Logan. Logan - could you introduce yourself, your background, and your role within the Polkadot ecosystem?

A (Logan): My initial involvement in the industry was as a smart contract engineer. During this time I worked on a few projects, including a reboot of the Ethereum Alarm Clock project originally by Piper Merriam. However, I had some frustrations at the time with the limitations of the EVM environment and began to look at other tools which could help me build the projects that I envisioned. This led to me looking at Substrate and completing a bounty for Web3 Foundation, after which I applied and joined the Technical Education team. My responsibilities at the Technical Education team include maintaining the Polkadot Wiki as a source of truth on the Polkadot ecosystem, creating example applications, writing technical documentation, giving talks and workshops, as well as helping initiatives such as the Thousand Validator Programme.

Q: The first technical question submitted for you was: "When will an official Polkadot mobile wallet appear?"

A (Logan): There is already an “official” wallet from Parity Technologies called the Parity Signer. Parity Signer allows you to keep your private keys on an air-gapped mobile device and to interactively sign messages using web interfaces such as Polkadot JS Apps. If you’re looking for something that is more of an interface to the blockchain as well as a wallet, you might be interested in PolkaWallet which is a community team that is building a full mobile interface for Polkadot.
For more information on Parity Signer check out the website: https://www.parity.io/signe

Q: Great thanks...our next question is: If someone already developed an application to run on Ethereum, but wants the interoperability that Polkadot will offer, are there any advantages to rebuilding with Substrate to run as a parachain on the Polkadot network instead of just keeping it on Ethereum and using the Ethereum bridge for use with Polkadot?

A (Logan): Yes, the advantage you would get from building on Substrate is more control over how your application will interact with the greater Polkadot ecosystem, as well as a larger design canvas for future iterations of your application.
Using an Ethereum bridge will probably have more cross chain latency than using a Polkadot parachain directly. The reason for this is due to the nature of Ethereum’s separate consensus protocol from Polkadot. For parachains, messages can be sent to be included in the next block with guarantees that they will be delivered. On bridged chains, your application will need to go through more routes in order to execute on the desired destination. It must first route from your application on Ethereum to the Ethereum bridge parachain, and afterward dispatch the XCMP message from the Polkadot side of the parachain. In other words, an application on Ethereum would first need to cross the bridge then send a message, while an application as a parachain would only need to send the message without needing to route across an external bridge.

Q: DOT transfers won't go live until Web3 removes the Sudo module and token holders approve the proposal to unlock them. But when will staking rewards start to be distributed? Will they have to wait until token transfers unlock? Or will accounts be able to accumulate rewards (still locked) once the network transitions to NPoS?

A (Logan): Staking rewards will be distributed starting with the transition to NPoS. Transfers will still be locked during the beginning of this phase, but reward payments are technically different from the normal transfer mechanism. You can read more about the launch process and steps at http://polkadot.network/launch-roadmap

Q: Next question is: I'm interested in how Cumulus/parachain development is going. ETA for when we will see the first parachain registered working on Kusama or some other public testnet like Westend maybe?

A (Logan): Parachains and Cumulus are a current high-priority development objective of the Parity team. There have already been PoC parachains running with Cumulus on local testnets for months. The current work is making the availability and validity subprotocols production-ready in the Polkadot client. The best way to stay up to date is to follow the project boards on GitHub, which delineate all of the tasks that need to be done. Ideally, we can start seeing parachains on Westend soon, with the first real parachains being deployed on Kusama thereafter.
The projects board can be viewed here: https://github.com/paritytech/polkadot/projects
Dan: Also...check out Basti's tweet from yesterday on the Cumulus topic: https://twitter.com/bkchstatus/1270479898696695808?s=20

Q: In what ways does Polkadot support smart contracts?

A (Logan): The philosophy behind the Polkadot Relay Chain is to be as minimal as possible, but allow arbitrary logic at the edges in the parachains. For this reason, Polkadot does not support smart contracts natively on the Relay Chain. However, it will support smart contracts on parachains. There are already a couple of major initiatives out there. One initiative is to allow EVM contracts to be deployed on parachains; this includes the Substrate EVM module, Parity’s Frontier, and projects such as Moonbeam. Another initiative is to create a completely new smart contract stack that is native to Substrate. This includes the Substrate Contracts pallet and the ink! DSL for writing smart contracts.
Learn more about Substrate's compatibility layer with Ethereum smart contracts here: https://github.com/paritytech/frontier
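As a rough illustration of the first route (EVM contracts on a parachain), here is a minimal TypeScript sketch assuming a Frontier-based chain or development node that exposes the standard Ethereum JSON-RPC, using the ethers.js v5 API. The endpoint URL below is a placeholder, not an official address; the point is only that ordinary Ethereum tooling can be pointed at such a parachain unchanged:

```typescript
import { ethers } from "ethers";

// Placeholder endpoint: any Frontier-based node exposing the Ethereum JSON-RPC,
// e.g. a local development node of an EVM-enabled Substrate template.
const ETH_RPC_URL = "http://127.0.0.1:9933";

async function main(): Promise<void> {
  // ethers v5: a generic JSON-RPC provider pointed at the parachain's RPC.
  const provider = new ethers.providers.JsonRpcProvider(ETH_RPC_URL);

  // Standard Ethereum calls work because Frontier emulates the Web3 RPC surface.
  const [network, blockNumber] = await Promise.all([
    provider.getNetwork(),
    provider.getBlockNumber(),
  ]);

  console.log(`chainId: ${network.chainId}, latest block: ${blockNumber}`);
}

main().catch(console.error);
```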

Will Pankiewicz, Master of Validators, Parity Technologies


Q: (Dan) Thanks for all the answers. Now we’ll start going through some staking questions with Will related to validating and nominating on Polkadot. Will - could you introduce yourself, your background, and your role within the Polkadot ecosystem?

A (Will): Sure thing. Like many others, Bitcoin drew me in back in 2013, but it wasn't until Ethereum came along that I took the deep dive into working in the space full time. It was the financial infrastructure aspects of cryptocurrencies I was initially interested in, and I first worked on dexes, algorithmic trading, and crypto funds. I really liked the idea of "Generalized Mining" that CoinFund came up with, and started to explore the wacky ways that crypto funds and others can both support ecosystems and be self-sustaining at the same time. This drew me to a lot of interesting experiments in what later became DeFi, as well as running validators on Proof of Stake networks. My role in the Polkadot ecosystem as “Master of Validators” is ensuring the needs of our validator community get met.

Q: Cool thanks. Our first community question was "Is it still more profitable to nominate the validators with lesser stake?"

A (Will): It depends on their commission, but generally yes, it is more profitable to nominate validators with less stake. When a validator has less stake, your nomination makes up a higher percentage of its total stake. This means that when rewards get distributed, the split is more favorable to you, as rewards are split by each backer's percentage of the total stake. The overall reward scheme is that every era (6 hours on Kusama, 24 hours on Polkadot) a certain amount of rewards is distributed, where that amount depends on the total amount of tokens staked across the entire network (50% of all tokens staked is currently considered optimal). The rewards from the end of an era get distributed roughly equally to all validators active in the validator set. The reward given to each validator is then split between the validator and all of its nominators, determined by the total stake that each entity contributes. So if you contribute a higher percentage of the total stake, you will earn more rewards.
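To make the arithmetic concrete, here is a minimal sketch in TypeScript of how one validator's era reward is split between the validator and its nominators pro rata by stake. The reward amount, commission and stakes are made-up figures for illustration, not real network data:

```typescript
// Illustrative numbers only: one validator's payout for a single era.
interface Backer {
  name: string;
  stake: number; // tokens bonded behind this validator
}

const validatorEraReward = 100; // era reward assigned to this validator (made up)
const commission = 0.05;        // 5% validator commission (made up)

const backers: Backer[] = [
  { name: "validator self-stake", stake: 10_000 },
  { name: "nominator A", stake: 5_000 },
  { name: "nominator B", stake: 35_000 },
];

const totalStake = backers.reduce((sum, b) => sum + b.stake, 0);

// The validator takes its commission off the top; the remainder is shared
// pro rata by stake, including the validator's own self-stake.
const shared = validatorEraReward * (1 - commission);

for (const b of backers) {
  const reward = (b.stake / totalStake) * shared;
  console.log(`${b.name}: ${reward.toFixed(2)} tokens`);
}
```

Because the era reward per validator is roughly equal regardless of how much stake backs it, the same nomination represents a larger fraction of a lightly staked validator's total, which is why, commission being equal, nominating less-saturated validators tends to pay more.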

Q: What does priority ranking under nominator addresses mean? For example, what does it mean that nominator A has priority 1 and nominator B has priority 6?

A (Will): Priority ranking is just the index of the nomination as it is stored on chain. It has no effect on how stake gets distributed in Phragmén or how rewards get calculated; it is only the order in which the nominator chose their validators. The way stake from a nominator gets distributed to validators is via Phragmén, an algorithm that optimally puts stake behind validators so that it is spread roughly equally across those that get into the validator set. It tries to maximize the total amount at stake in the network and maximize the stake behind the minimally staked validators.

Q: On Polkadot.js, what does it mean when there are nodes waiting on Polkadot?

A (Will): In Polkadot there is a fixed validator set size that is determined by governance. The way validators get in the active set is by having the highest amount of total stake relative to other validators. So if the validator set size is 100, the top 100 validators by total stake will be in the validator set. Those not active in the validator set will be considered “waiting”.
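For instance, here is a hedged sketch of how you might list active versus waiting validators yourself with the @polkadot/api library. The RPC endpoint is a placeholder (substitute your own node), and storage names can shift between runtime versions:

```typescript
import { ApiPromise, WsProvider } from "@polkadot/api";

async function main(): Promise<void> {
  // Placeholder endpoint; point this at your own node if preferred.
  const api = await ApiPromise.create({
    provider: new WsProvider("wss://rpc.polkadot.io"),
  });

  // Validators in the current active set for this session.
  const active = (await api.query.session.validators()).map((a) => a.toString());

  // Every account that has declared intent to validate (active + waiting).
  const intents = (await api.query.staking.validators.entries()).map(
    ([key]) => key.args[0].toString()
  );

  const waiting = intents.filter((address) => !active.includes(address));
  console.log(`active: ${active.length}, waiting: ${waiting.length}`);

  await api.disconnect();
}

main().catch(console.error);
```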

Q: Another question...Is it necessary to become a waiting validator node right now?

A (Will): It's not necessary, but it is highly encouraged if you actively want to validate on Polkadot. The longer you are in the waiting tab, the more exposure you get to nominators who may nominate you.

Q: Will current validators for Kusama also validate for Polkadot? How strongly should I consider their history (with Kusama) when looking to nominate a good validator for DOTs?

A (Will): A lot of Kusama validators will also be validators for Polkadot, as KSM was initially distributed to DOT holders. The early Kusama Validators will also likely be the first Polkadot validators. Being a Kusama validator should be a strong indicator for who to nominate on Polkadot, as the chaos that has ensued with Kusama has allowed validators to battle test their infrastructure. Kusama validators by now are very familiar with tooling, block explorers, terminology, common errors, log formats, upgrades, backups, and other aspects of node operation. This gives them an edge against Polkadot validators that may be new to the ecosystem. You should strongly consider well known Kusama validators when making your choices as a nominator on Polkadot.

Q: Can you go into more details about the process for becoming a DOT validator? Is it similar as the KSM 1000 validators program?

A (Will): The process for becoming a DOT validator is first to have DOTs. You cannot be a validator without DOTs, as DOTs are used to pay transaction fees, and the minimum amount of DOTs you need is enough to submit a validate transaction. After obtaining enough DOTs, you will need to set up your validator infrastructure. Ideally you should have a validator node with specs that match what we call standard hardware, as well as one or more sentry nodes to help isolate the validator node from attacks. After the infrastructure is up and running, you should have your Polkadot accounts set up correctly, with a stash bonded to a controller account, and then submit a validate transaction, which tells the network your nodes are ready to be part of the network. You should then try to build a community around your validator to let others know you are trustworthy so that they will nominate you. The 1000 Validators Programme for Kusama is a programme that gives a certain amount of nominations from the Web3 Foundation and Parity to help bootstrap a community and reputation for validators. There may eventually be a similar type of programme for Polkadot as well.
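As a sketch of the on-chain half of that process, the following TypeScript uses @polkadot/api to bond a stash to a controller and then submit the validate transaction. The endpoint, seeds, amounts and commission are placeholders, and the staking extrinsic signatures shown reflect the runtime around the time of this AMA and may change later, so treat this as illustrative rather than copy-paste ready:

```typescript
import { ApiPromise, WsProvider, Keyring } from "@polkadot/api";

async function main(): Promise<void> {
  const api = await ApiPromise.create({
    provider: new WsProvider("wss://rpc.polkadot.io"), // placeholder endpoint
  });

  const keyring = new Keyring({ type: "sr25519" });
  // Placeholder seeds; use your own stash and controller accounts.
  const stash = keyring.addFromUri("//StashSeedPlaceholder");
  const controller = keyring.addFromUri("//ControllerSeedPlaceholder");

  // 1. Bond funds from the stash and designate the controller to manage them.
  //    Amounts are in plancks (1 DOT = 10^10 plancks); the figure is illustrative.
  await api.tx.staking
    .bond(controller.address, 100n * 10n ** 10n, "Staked")
    .signAndSend(stash);

  // In practice, wait for the bond to be included in a block before continuing.

  // 2. Declare intent to validate from the controller with a 5% commission
  //    (ValidatorPrefs commission is expressed in parts per billion).
  await api.tx.staking
    .validate({ commission: 50_000_000 })
    .signAndSend(controller);

  await api.disconnect();
}

main().catch(console.error);
```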
Dan: Thanks a lot for all the answers, Will. That’s the end of the pre-submitted questions and now we’ll open the chat up to live Q&A, and our three team members will get through as many of your questions as possible.
We will take questions related to business development, technology, validating, and staking. For those wondering about DOT:
DOT tokens do not exist yet. Allocations of Polkadot's native DOT token are technically and legally non-transferable. Hence any publicized sale of DOTs is unsanctioned by Web3 Foundation and possibly fraudulent. Any official public sale of DOTs will be announced on the Web3 Foundation website. Polkadot’s launch process started in May and will continue toward full network decentralization later this year; holders of DOT allocations will determine issuance and transferability. For those who participated in previous DOT sales, you can learn how to claim your DOTs here (https://wiki.polkadot.network/docs/en/claims).


Telegram Community Follow-up Questions Addressed Below


Q: Polkadot looks good, but it confuses me that there are so many other blockchain projects. What should I pay attention to in Polkadot to give it the importance it deserves? What are you planning to achieve with your project?

A (Will): Personally, what I think differentiates it is the governance process. Coordinating forkless upgrades and social coordination helps stand it apart.
A (Dieter): The wiki is awesome - https://wiki.polkadot.network/

Q: Over 10,000 ETH was recently paid as a transaction fee; what if this happens on Polkadot? Is it possible to go through governance to return it to the owner?

A: Anything is possible with governance including transaction reversals, if a network quorum is reached on a topic.
A (Logan): Polkadot transaction fees work differently than the fees on Ethereum, so it's a bit more difficult to shoot yourself in the foot like the whale who sent this unfortunate transaction. See here for details on fees: https://w3f-research.readthedocs.io/en/latest/polkadot/Token%20Economics.html?highlight=transaction%20fees#relay-chain-transaction-fees-and-per-block-transaction-limits
However, there is a tip that users can set themselves, which they could accidentally set to a large amount. In that case, yes, they could put a proposal to governance to return the amount that was paid in the tip.
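For reference, the tip is something the sender sets explicitly in the transaction options. Below is a minimal @polkadot/api sketch showing where that value goes; the endpoint, seed and amounts are placeholders, and the transfer is to the sender's own address purely for illustration:

```typescript
import { ApiPromise, WsProvider, Keyring } from "@polkadot/api";

async function sendWithTip(): Promise<void> {
  const api = await ApiPromise.create({
    provider: new WsProvider("wss://rpc.polkadot.io"), // placeholder endpoint
  });
  const sender = new Keyring({ type: "sr25519" }).addFromUri("//SenderSeedPlaceholder");

  // The optional `tip` is paid on top of the weight-based fee and is chosen
  // entirely by the sender -- an extra zero here is exactly the kind of
  // mistake discussed above. Values are in plancks (1 DOT = 10^10 plancks).
  await api.tx.balances
    .transfer(sender.address, 10n * 10n ** 10n) // self-transfer, illustrative
    .signAndSend(sender, { tip: 1_000_000_000n });

  await api.disconnect();
}

sendWithTip().catch(console.error);
```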

Q: What is the minimum ideal amount of DOT and KSM to have if you want to become a validator and how much technical knowledge do you need aside from following the docs?

A (Will): It depends on what the other validators in the ecosystem are staking, as well as the validator set size. You just need to be within the top validators by stake, up to the set size. So if the set size is 100 validators, you need to be in the top 100 validators by stake.

Q: Will Web3 nominate validators? If yes, which criteria to be elected?

A (Will): Web3 Foundation is running programs like the 1000 Validators Programme for Kusama. There's a possibility this will continue for Polkadot as well after transfers are enabled. https://thousand-validators.kusama.network/#/
You will need to be an active validator to earn rewards; only those active in the validator set earn rewards. I would recommend checking out this part of the wiki: https://wiki.polkadot.network/docs/en/maintain-guides-validator-payout

Q: Is it possible to implement hashtables or DAGs with Substrate?

A (Logan): Yes.

Q: Polkadot project looks very futuristic! But, could you tell us the main role of DOT Tokens in the Polkadot Ecosystem?

A (Dan): That's a good question. The short answer is Staking, Governance, Bonding. More here: http://polkadot.network/dot-token

Q: How did you manage to prove that the consensus protocol is safe and unbreakable mathematically?

A (Dieter): We have a research team of over a dozen scientists with PhDs and post-docs in cryptography and distributed computing who do thorough theoretical analyses of all the protocols used in Polkadot.

Q: What are the prospects for NFT?

A: Already being built 🙂

Q: What will be Polkadot next roadmap for 2020 ?

A (Dieter): Building. But seriously - we will continue to add many more features and upgrades to Polkadot as well as continue to strongly focus on adoption from other builders in the ecosystem 🙂
A (Will): https://polkadot.network/launch-roadmap/
This is the launch roadmap. Ideally we'll be adding parachains and XCMP towards the end of the year.

Q: How do you stay active in terms of marketing developments during this pandemic? Because I'm sure you're very excited to promote more after this settles down.

A (Dan): The main impact of covid was the impact on in-person events. We have been very active on Crowdcast for webinars since 2019, so it was quite the smooth transition to all-online events. You can see our 40+ past event recordings and follow us on Crowdcast here: https://www.crowdcast.io/polkadot. If you're interested in following our emails for updates (including online events), subscribe here: https://info.polkadot.network/subscribe

Q: Hi, who do you think is your biggest competitor in the space?

A (Dan): Polkadot is a metaprotocol that hasn't been seen in the industry up until this point. We hope to elevate the industry by providing interoperability between all major public networks as well as private blockchains.

Q: Is Polkadot a friend or competitor of Ethereum?

A: Polkadot aims to elevate the whole blockchain space with serious advancements in interoperability, governance and beyond :)

Q: When will there be hardware wallet support?

A (Will): Parity Signer works well for now. Other hardware wallets will be added pretty soon

Q: What are the attractive features of the DOT project that can attract new users?

A: https://polkadot.network/what-is-polkadot-a-brief-introduction/
A (Will): Building parachains with cross-chain messaging + bridges to other chains will, I think, be a very appealing feature for developers

Q: According to you how much time will it take for Polkadot to get into mainstream adoption and execute all the plans set for this project?

A: We are solving many problems that have held back the blockchain industry up until now. Here is a summary in basic terms:
https://preview.redd.it/ls7i0bpm8p951.png?width=752&format=png&auto=webp&s=a8eb7bf26eac964f6b9056aa91924685ff359536

Q: When will bitpie or imtoken support DOT?

A: We are working on integrations on all the biggest and best wallet providers. ;)

Q: What event/call can we track to catch a switch to nPOS? Is it only force_new_era call? Thanks.

A (Will): If you're on riot, useful channels to follow for updates like this are #polkabot:matrix.org and #polkadot-announcements:matrix.parity.io
A (Logan): Yes this is the trigger for initiating the switch to NPoS. You can also poll the ForceEra storage for when it changes to ForceNew.
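A small sketch of that polling with @polkadot/api follows. The endpoint is a placeholder, and it assumes the staking pallet's ForceEra storage item is exposed as `staking.forceEra` as it was around the time of this AMA:

```typescript
import { ApiPromise, WsProvider } from "@polkadot/api";

async function watchForceEra(): Promise<void> {
  const api = await ApiPromise.create({
    provider: new WsProvider("wss://rpc.polkadot.io"), // placeholder endpoint
  });

  // Subscribe to the staking pallet's ForceEra storage item and log changes.
  // The switch described above shows up as the value becoming `ForceNew`.
  await api.query.staking.forceEra((forcing) => {
    console.log(`ForceEra is now: ${forcing.toString()}`);
    if (forcing.toString() === "ForceNew") {
      console.log("New era forced - NPoS transition triggered.");
    }
  });
}

watchForceEra().catch(console.error);
```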

Q: What strategy will the Polkadot Team use to make new users trust its platform and be part of it?

A (Will): Pushing bleeding-edge cryptography from Web3 Foundation research
A (Dan): https://t.me/PolkadotOfficial/43378

Q: What technology stands behind and What are its advantages?

A (Dieter): Check out https://polkadot.network/technology/ for more info on our tech stack!

Q: What problems do you see occurring in the blockchain industry nowadays, and how does your project aim to solve them?

A (Will): Governance I see as a huge problem. For example upgrading Bitcoin and making decisions for changing things is a very challenging process. We have robust systems of on-chain governance to help solve these coordination problems

Q: How involved are the Polkadot partners? Are they helping with the development?

A (Dieter): There are a variety of groups building in the Polkadot ecosystem. Check out http://www.polkaproject.com/ for a great list.

Q: Can you explain the role of the treasury in Polkadot?

A (Will): The treasury is for projects or people that want to build things, but don't want to go through the formal legal process of raising funds from VCs or grants or what have you. You can get paid by the community to build projects for the community.
A: There’s a whole section on the wiki about the treasury and how it functions here https://wiki.polkadot.network/docs/en/mirror-learn-treasury#docsNav

Q: Any plan to introduce Polkadot on Asia, or rising market on Asia?

A (Will): We're globally focused

Q: What kind of impact do you expect from the Council? Although it would be elected by token holders, what kind of people you wish to see there?

A (Will): Community focused individuals like u/jam10o that want to see cool things get built and cool communities form

If you have further questions, please ask in the official Polkadot Telegram channel.
submitted by dzr9127 to dot

Get Started with The Crypto Genius

Once a person starts using the Ethereum Code, he/she does not have to employ any other technical indicators or strategies of their own, since the application takes care of the rest. All one has to do is routinely monitor the performance of the bot and check that the trading parameters that had been set beforehand are left as they are. Additionally, the video also carries a message from Marc Weston, the CEO of Ethereum Code, in which he speaks about the core idea underlying the product. In his own words, the idea to create Ethereum Code came to him after he started talking to one of his office colleagues who had amassed quite a massive fortune by investing in Bitcoin at the turn of the last decade.

https://preview.redd.it/ash1nbth76f51.jpg?width=1024&format=pjpg&auto=webp&s=bc6d86c15250b65bb721f1ce136cd461a302bd0b
Let's review the Ethereum Code bot and see if the automatic cryptocurrency trading platform is a scam or a legit software program that can truly produce real profits.
The creator of the bot goes by the name of Marc Weston. In his own words, prior to creating this novel trading bot, he worked as a backend staff member for a number of different multinational software corporations. However, he then decided to ditch his 9-5 career in favor of becoming a full-time crypto trader. Whether or not the story behind Marc Weston is real remains to be seen, as there is not much information available to research his story and credentials.

Step 1: Account Creation

https://preview.redd.it/kjzalrsi76f51.jpg?width=770&format=pjpg&auto=webp&s=27b132c4e07f8a0f15580f68529e0b7d35af9070
As with any exchange platform, Ethereum Code also requires users to create a personal trading account. To do this, one has to click on the ‘start now’ button located at the top right side of the app’s landing page. Following this, users need to fill out their details (such as name, email address, country of residence, etc.).

Before submitting any money into the app, users are provided with an opportunity to run a full trial of the algorithms that govern the trading bot. In order to do this, all one has to do is click on the “Go to Demo” button. Following this, the platform will automatically take you to a simulation zone where you will be given $1,500 worth of virtual funds. These funds can then be used for trading in real-life market conditions without any risk. The entire process is quite simple, and it is recommended that the bot be left to do its thing for a period of at least 30 to 45 minutes.
Once the aforementioned demo trading phase has concluded, users can proceed to deposit funds into their personal Ethereum Code accounts (provided they are satisfied with the overall efficacy of the platform). In order to make a deposit, one has to click on the “Go to Live” button and then proceed with the deposit. Once the process has been initiated, users will be sent a message from the app’s admin staff. The entire process is quite streamlined, but a deposit can take a few days to execute.
Step 4: Start the Trading Process
Once all of the initial formalities are done, users can start using Ethereum Code by simply clicking on the red “Off” button under the “Auto Trading” tab. As soon as this is done, the app starts to source out the best available trade opportunities available at that given moment.
What Makes Ethereum Code Bot Better Than Others?

https://preview.redd.it/5p23njrj76f51.jpg?width=730&format=pjpg&auto=webp&s=61c00f9dee447b9b91a03900477fd9027bd13135
While none of this is factually provable upon reviewing The Ethereum Bot trading service, here is what the official website claims:
(i) Easy to use: When compared to a number of other similar products available in the market today, Ethereum Code is extremely simple and straightforward to use
(ii) Users do not have to reveal any details related to their credit/debit cards or Paypal account.
As a quick recap on the trending automated cryptocurrency trading software platform, let's answer the most pressing questions about the Ethereum Code Bot:
Q: What is the Ethereum Code Bot?
A: The Ethereum Code Bot claims to be automated trading software that analyzes current trading trends in the market using a state-of-the-art algorithm that automatically makes trading bets on your behalf. The accuracy, legitimacy and veracity of the Ethereum Code are largely unproven and unfounded despite the numerous claims of producing real results and profits for users.

Q: How much can you earn with the Ethereum Code Bot?
A: According to the official website, the Ethereum Code trading bot says the sky is the limit for the profits it can earn for traders. Depending on the initial amount deposited, users are expected to earn daily profits, which are all outlined in the members area. While the Ethereum Bot makers boldly state, and would like to have you believe, that thousands of dollars a day are possible, it is likely not a reality for the majority, if any. However, one might suspect some profit is available given the glorification of their trading bot software, but it is a buyer-beware opportunity to say the least. High risk, medium reward is possibly another way of putting it.

Q: How much does the Ethereum Code Bot cost?
A: Apparently this amazing auto-trading bot is free to use for all users. It is unclear how much the full Ethereum Code software service costs despite it being advertised as free to use.
While there is never a magic piece of software that can help investors get rich overnight, Ethereum Code seems to be a risky yet semi-legit tool that can allow users to maximize their crypto returns in the easiest, most hassle-free manner possible.

To start making use of the ‘ground-breaking, automated cryptocurrency trading application', all one has to do is go to the official company website and follow the instructions that have been outlined there. The entire process is quite simple and should not take more than 10-15 minutes to complete. Just beware of the inherent risks in using an automated trading platform and service, as either are capable of failing and disappearing overnight.

The Ethereum Code bot trading service turning out to be a scam, or losing money on its automated investments, are both possibilities that everyone should keep in mind if opting to try the Ethereum Code. There are other opportunities that may fit the scope of your goals and needs in terms of generating cryptocurrency profits, like The Investment of the Decade and Crypto Income Quarterly, that may be exactly what you are looking for versus using the automated crypto trading bot service, the Ethereum Code.
https://www.cryptoerapro.com/the-crypto-genius/
submitted by cryptoerapro to u/cryptoerapro

LET'S TALK ABOUT THE BITCOIN HALVING!! REAL TALK.. BITCOIN, LITECOIN, ETHEREUM, DIGIBYTE UPDATE!!
Cryptocurrency Crash? Lets talk Ethereum, Bitcoin, Litecoin and ICOs.
Lets Talk About Bitcoin & Cryptocurrency In 2020
Let's Talk ETC! (Ethereum Classic) #54 - Bob Summerwill of ETH Project - On ETC & ETH Cooperating
SUNDAY NIGHT CHARTS - LET'S TALK BITCOIN, LITECOIN, ETHEREUM

The LTB Network provides a tokenized platform for podcasts, articles, and forums about the ideas, people, and projects building the new digital economy and the future of money.

Whenever there is talk about Bitcoin or cryptocurrencies, you will hear about Ethereum and smart contracts. You can call Bitcoin the King of the cryptocurrency market and Ethereum the father of smart contracts. You may hear from reputable sources that many new innovations and cryptocurrency projects were built on the Ethereum blockchain. Let's find out if it is still worth investing in ...

Let's Talk Bitcoin! #421 Stone Money and Echoes of the Past. On Today's Show… We're listening to echoes of the past. In the midst of the biggest bubble yet, finding ourselves in the "then they laugh at you" phase, and with the China narrative rising for the first time as trade volumes overtook the rest of the world, I was joined by Stephanie Murphy and Andreas M. Antonopoulos in ...

Now let's talk about wrapped Bitcoin, or WBTC, which is an ERC-20 token representing Bitcoin on the Ethereum blockchain. WBTC is minted by locking a corresponding amount of BTC with a custodian. Why do we need WBTC? Simple: the Ethereum ecosystem, which includes DeFi, provides outstanding opportunities for yield generation. These are not available on the Bitcoin blockchain.

On today's episode of Let's Talk Bitcoin! you're invited to join Andreas M. Antonopoulos, Adam B. Levine, Stephanie Murphy and special guest Richard Myers for an in-depth look at the past, present and future of 'Mobile Mesh Networking' technology and the open source LOT49 protocol built on top of lightning.


LET'S TALK ABOUT THE BITCOIN HALVING!! REAL TALK.. BITCOIN, LITECOIN, ETHEREUM, DIGIBYTE UPDATE!!

LET'S TALK ABOUT THE BITCOIN HALVING!! REAL TALK.. BITCOIN, LITECOIN, ETHEREUM, DIGIBYTE UPDATE!! Not Financial Advice! Entertainment only! BTC Address for those who want to support my work ...

Technical Analysis and Teaching charts with Bitcoin, Litecoin, Ethereum, DGB, ADA, XRP, XTZ and more.

Lets talk Ethereum, Bitcoin, Litecoin and ICOs. ... I talk about ICO's like Civic. I mention Altcoins like Numeria. And finally I chat about creating my own mobile decentralised database as an ...

Let's Talk ETC! provides timely news about ETC (Ethereum Classic) and related technologies including: Ethereum, Bitcoin, blockchains and more. Feel free to leave requests, questions and comments ...
