Why I’m bullish on Zilliqa (long read)

Edit: TL;DR added in the comments
 
Hey all, I've been researching coins since 2017 and have gone through 100s of them in the last 3 years. I got introduced to blockchain via Bitcoin of course, analyzed Ethereum thereafter, and from that moment I have had a keen interest in smart contract platforms. I’m passionate about Ethereum, but I find Zilliqa to have a better risk-reward ratio, especially because Zilliqa has, in my opinion, found an elegant balance between being secure, decentralized and scalable.
 
Below I post my analysis of why, out of all the coins I went through, I’m most bullish on Zilliqa (yes, I went through Tezos, EOS, NEO, VeChain, Harmony, Algorand, Cardano etc.). Note that this is not investment advice, and although it's a thorough analysis, there is obviously some bias involved. Looking forward to what you all think!
 
Fun fact: the name Zilliqa is a play on ‘silica’, silicon dioxide, as in “silicon for the high-throughput consensus computer.”
 
This post is divided into (i) Technology, (ii) Business & Partnerships, and (iii) Marketing & Community. I’ve tried to make the technology part readable for a broad audience. If you’ve ever tried understanding the inner workings of Bitcoin and Ethereum you should be able to grasp most parts. Otherwise, just skim through and once you are zoning out head to the next part.
 
Technology and some more:
 
Introduction
 
The technology is one of the main reasons why I’m so bullish on Zilliqa. First thing you see on their website is: “Zilliqa is a high-performance, high-security blockchain platform for enterprises and next-generation applications.” These are some bold statements.
 
Before we deep dive into the technology let’s take a step back in time first as they have quite the history. The initial research paper from which Zilliqa originated dates back to August 2016: Elastico: A Secure Sharding Protocol For Open Blockchains where Loi Luu (Kyber Network) is one of the co-authors. Other ideas that led to the development of what Zilliqa has become today are: Bitcoin-NG, collective signing CoSi, ByzCoin and Omniledger.
 
The technical white paper was made public in August 2017 and since then they have achieved everything stated in it. They have also created their own open-source, intermediate-level smart contract language called Scilla (a functional programming language similar to OCaml).
 
Mainnet has been live since the end of January 2019, with daily transaction rates growing continuously. About a week ago mainnet reached 5 million transactions and 500.000+ addresses in total, along with 2400 nodes keeping the network decentralized and secure. Circulating supply is nearing 11 billion, and at this point only mining rewards remain to be released. The maximum supply is 21 billion; annual inflation is currently 7.13% and will only decrease with time.
 
Zilliqa realized early on that the usage of public cryptocurrencies and smart contracts was increasing, but that decentralized, secure, and scalable alternatives were lacking in the crypto space. They proposed to apply sharding onto a public smart contract blockchain, where the transaction rate increases almost linearly with the number of nodes. More nodes = higher transaction throughput and increased decentralization. Sharding comes in many forms and Zilliqa uses network-, transaction- and computational sharding. Network sharding opens up the possibility of using transaction- and computational sharding on top. Zilliqa does not use state sharding for now. We’ll come back to this later.
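To make “almost linear” concrete, here is a toy back-of-the-envelope model, written in OCaml since that language comes up later anyway. This is my own sketch: the per-shard transaction rate is an assumed illustrative figure, not an official Zilliqa number.

```ocaml
(* Toy model of network sharding: every additional group of 600 nodes
   forms a new shard that processes transactions in parallel. One
   group is reserved for the DS committee. tps_per_shard is an
   assumed illustrative figure, not an official Zilliqa number. *)
let shard_size = 600
let tps_per_shard = 250.

let throughput nodes =
  let tx_shards = max 0 (nodes / shard_size - 1) in
  float_of_int tx_shards *. tps_per_shard

let () =
  List.iter
    (fun n -> Printf.printf "%5d nodes -> %6.0f tx/s\n" n (throughput n))
    [1200; 2400; 4800]
```

Doubling the node count roughly doubles the number of transaction shards, which is the linear scaling the team refers to.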
 
Before we continue dissecting how Zilliqa achieves this from a technological standpoint, it’s good to keep in mind that making a blockchain decentralised, secure and scalable at the same time is still one of the main hurdles to widespread usage of decentralised networks. In my opinion this needs to be solved before blockchains can get to the point where they create and add large-scale value. So I invite you to read the next section to grasp the underlying fundamentals. Because after all, these premises need to be true, otherwise there isn’t a fundamental case to be bullish on Zilliqa, right?
 
Down the rabbit hole
 
How have they achieved this? Let’s define the basics first: key players on Zilliqa are the users and the miners. A user is anybody who uses the blockchain to transfer funds or run smart contracts. Miners are the (shard) nodes in the network who run the consensus protocol and get rewarded for their service in Zillings (ZIL). The mining network is divided into several smaller networks called shards, which is also referred to as ‘network sharding’. Miners subsequently are randomly assigned to a shard by another set of miners called DS (Directory Service) nodes. The regular shards process transactions and the outputs of these shards are eventually combined by the DS shard as they reach consensus on the final state. More on how these DS shards reach consensus (via pBFT) will be explained later on.
 
The Zilliqa network produces two types of blocks: DS blocks and Tx blocks. One DS Block consists of 100 Tx Blocks. And as previously mentioned there are two types of nodes concerned with reaching consensus: shard nodes and DS nodes. Whether a node becomes a shard node or a DS node is determined by the result of a PoW cycle (Ethash) at the beginning of the DS Block. All candidate mining nodes compete with each other and run the PoW (Proof-of-Work) cycle for 60 seconds, and the submissions achieving the highest difficulty will be allowed on the network. To put it in perspective: the average difficulty for one DS node is ~2 Th/s, equaling 2.000.000 Mh/s or 55 thousand+ GeForce GTX 1070 / 8 GB GPUs at 35.4 Mh/s each. With each DS Block, 10 new DS nodes are admitted. A shard node needs to provide around 8.53 GH/s currently (around 240 GTX 1070s). Dual mining ETH/ETC and ZIL is possible and can be done via mining software such as Phoenix and Claymore. There are pools, and if you have large amounts of hashing power (Ethash) available you could mine solo.
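To sanity-check those figures, here is the arithmetic of the paragraph above in a few lines of OCaml, using the quoted 35.4 Mh/s per GTX 1070:

```ocaml
(* Rough GPU-count equivalents for the Ethash difficulty figures above.
   All rates are expressed in Mh/s; 1 Th/s = 1_000_000 Mh/s. *)
let gtx1070_mhs = 35.4                 (* one GeForce GTX 1070 *)
let ds_node_mhs = 2.0 *. 1_000_000.    (* ~2 Th/s for a DS node *)
let shard_node_mhs = 8.53 *. 1_000.    (* ~8.53 GH/s for a shard node *)

let gpus_needed rate = rate /. gtx1070_mhs

let () =
  Printf.printf "DS node    ~ %.0f GTX 1070s\n" (gpus_needed ds_node_mhs);
  Printf.printf "Shard node ~ %.0f GTX 1070s\n" (gpus_needed shard_node_mhs)
```

This prints roughly 56.497 and 241 GPUs, matching the “55 thousand+” and “around 240” figures quoted above.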
 
The PoW cycle of 60 seconds is a peak performance effort and acts as an entry ticket to the network. The entry ticket is called a sybil resistance mechanism and makes it incredibly hard for adversaries to spawn lots of identities and manipulate the network with them. After every 100 Tx Blocks, which corresponds to roughly 1,5 hours, this PoW process repeats. In between, no PoW needs to be done, meaning Zilliqa’s energy consumption to keep the network secure is low. For more detailed information on how mining works click here.
Okay, hats off to you. You have made it this far. Before we go any deeper down the rabbit hole we first must understand why Zilliqa goes through all of the above technicalities and understand a bit more what a blockchain on a more fundamental level is. Because the core of Zilliqa’s consensus protocol relies on the usage of pBFT (practical Byzantine Fault Tolerance) we need to know more about state machines and their function. Navigate to Viewblock, a Zilliqa block explorer, and just come back to this article. We will use this site to navigate through a few concepts.
 
We have established that Zilliqa is a public and distributed blockchain. Meaning that everyone with an internet connection can send ZILs, trigger smart contracts, etc. and there is no central authority who fully controls the network. Zilliqa and other public and distributed blockchains (like Bitcoin and Ethereum) can also be defined as state machines.
 
Taking the liberty of paraphrasing examples and definitions from Samuel Brooks’ Medium article: he describes a blockchain (like Zilliqa) as “a peer-to-peer, append-only datastore that uses consensus to synchronize cryptographically-secure data”.
 
Next, he states that "blockchains are fundamentally systems for managing valid state transitions”. For some more context, I recommend reading the whole Medium article to get a better grasp of the definitions and understanding of state machines. Nevertheless, let’s try to simplify and compile it into a single paragraph. Take traffic lights as an example: all their states (red, amber, and green) are predefined, all possible outcomes are known, and it doesn’t matter if you encounter the traffic light today or tomorrow; it will still behave the same. Managing the states of a traffic light can be done by triggering a sensor on the road or pushing a button, resulting in one traffic light’s state going from green to red (via amber) and another light’s from red to green.
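Since the whole argument rests on the state machine idea, here is that traffic light as a minimal state machine in OCaml (my own illustration, nothing Zilliqa-specific): the states form a closed set and the transition function is deterministic.

```ocaml
(* A traffic light as a finite state machine: the set of states is
   closed, and the transition function is total and deterministic. *)
type light = Red | Amber | Green

let next = function
  | Green -> Amber      (* green goes to red via amber *)
  | Amber -> Red
  | Red   -> Green

let () =
  let rec show l n =
    if n > 0 then begin
      print_endline (match l with Red -> "Red" | Amber -> "Amber" | Green -> "Green");
      show (next l) (n - 1)
    end
  in
  show Green 4   (* prints Green, Amber, Red, Green *)
```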
 
With public blockchains like Zilliqa, this isn’t so straightforward and simple. It started with block #1 almost 1,5 years ago, and every 45 seconds or so a new block linked to the previous one is added, resulting in a chain of blocks with transactions in them that everyone can verify, from block #1 to the current #647.000+ block. The state is ever-changing and the states it can find itself in are infinite. And while a traffic light might work in tandem with various other traffic lights, that’s rather insignificant compared to a public blockchain, because Zilliqa consists of 2400 nodes that need to work together to achieve consensus on what the latest valid state is, while some of these nodes may have latency or broadcast issues, drop offline, or are deliberately trying to attack the network.
 
Now go back to the Viewblock page, take a look at the number of transactions, addresses, block height and DS height, and then hit refresh. As expected, you see newly incremented values for one or all parameters. And how did the Zilliqa blockchain manage to transition from the previous valid state to the latest valid state? By using pBFT to reach consensus on the latest valid state.
 
After having obtained the entry ticket, miners execute pBFT to reach consensus on the ever-changing state of the blockchain. pBFT requires a series of network communication rounds between nodes, so no GPU is involved (only CPU), resulting in a low total energy consumption to keep the blockchain secure, decentralized and scalable.
 
pBFT stands for practical Byzantine Fault Tolerance and is an optimization on the Byzantine Fault Tolerant algorithm. To quote Blockonomi: “In the context of distributed systems, Byzantine Fault Tolerance is the ability of a distributed computer network to function as desired and correctly reach a sufficient consensus despite malicious components (nodes) of the system failing or propagating incorrect information to other peers.” Zilliqa is such a distributed computer network and depends on the honesty of the nodes (shard and DS) to reach consensus and to continuously update the state with the latest block. If pBFT is a new term for you I can highly recommend the Blockonomi article.
 
The idea of pBFT was introduced in 1999 - one of the authors even won a Turing award for it - and it is well researched and applied in various blockchains and distributed systems nowadays. If you want more advanced information than the Blockonomi link provides, click here. And if you want something in between the Blockonomi article and the National University of Singapore material, read the Zilliqa Design Story Part 2 dating from October 2017.
Quoting from the Zilliqa tech whitepaper: “pBFT relies upon a correct leader (which is randomly selected) to begin each phase and proceed when the sufficient majority exists. In case the leader is byzantine it can stall the entire consensus protocol. To address this challenge, pBFT offers a view change protocol to replace the byzantine leader with another one.”
 
pBFT can tolerate up to ⅓ of the nodes being dishonest (offline counts as Byzantine = dishonest) and the consensus protocol will function without stalling or hiccups. Once more than ⅓ but no more than ⅔ of the nodes are dishonest, the network will stall and a view change will be triggered to elect a new DS leader. Only when more than ⅔ of the nodes are dishonest (>66%) do double-spend attacks become possible.
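Those thresholds are easy to express directly. Here is a small sketch of the three regimes described above (my own illustration, following the post’s ⅓ and ⅔ boundaries):

```ocaml
(* pBFT safety/liveness regimes as a function of dishonest nodes.
   With n nodes and f dishonest (offline counts as dishonest):
   f <= n/3 keeps running, n/3 < f <= 2n/3 stalls (view change),
   f > 2n/3 makes double-spends possible. *)
type regime = Live | Stalled_view_change | Unsafe

let pbft_regime ~nodes ~dishonest =
  if 3 * dishonest <= nodes then Live
  else if 3 * dishonest <= 2 * nodes then Stalled_view_change
  else Unsafe

let () =
  let describe = function
    | Live -> "consensus proceeds"
    | Stalled_view_change -> "network stalls, view change elects a new leader"
    | Unsafe -> "double-spend attacks become possible"
  in
  List.iter
    (fun f ->
       Printf.printf "%d/600 dishonest: %s\n" f
         (describe (pbft_regime ~nodes:600 ~dishonest:f)))
    [150; 300; 450]
```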
 
If the network stalls no transactions can be processed and one has to wait until a new honest leader has been elected. When the mainnet was just launched and in its early phases, view changes happened regularly. As of today the last stalling of the network - and view change being triggered - was at the end of October 2019.
 
Another benefit of using pBFT for consensus, besides low energy use, is the immediate finality it provides. Once your transaction is included in a block and the block is added to the chain, it’s done. Lastly, take a look at this article where three types of finality are defined: probabilistic, absolute and economic finality. Zilliqa falls under absolute finality (just like Tendermint, for example). Although lengthy already, we only skimmed through some of the inner workings of Zilliqa’s consensus: read the Zilliqa Design Story Part 3 and you will be close to having a complete picture of it. Enough about PoW, the sybil resistance mechanism, pBFT, etc. Another thing we haven’t looked at yet is the degree of decentralisation.
 
Decentralisation
 
Currently, there are four shards, each consisting of 600 nodes: 1 shard with 600 so-called DS nodes (Directory Service - they need to achieve a higher difficulty than shard nodes) and 1800 shard nodes, of which 250 are shard guards (centralized nodes controlled by the team). The amount of shard guards has steadily declined from 1200 in January 2019 to 250 as of May 2020. On the Viewblock statistics you can see that many of the nodes are located in the US, but those are only the (CPU parts of the) shard nodes that perform pBFT; there is no data on where the PoW sources are coming from. And when the Zilliqa blockchain starts reaching its transaction capacity limit, a network upgrade needs to be executed to lift the current cap of 2400 nodes, allowing more nodes and the formation of more shards, which will let the network keep scaling according to demand.
Besides shard nodes there are also seed nodes. The main role of seed nodes is to serve as direct access points (for end-users and clients) to the core Zilliqa network that validates transactions. Seed nodes consolidate transaction requests and forward these to the lookup nodes (another type of nodes) for distribution to the shards in the network. Seed nodes also maintain the entire transaction history and the global state of the blockchain which is needed to provide services such as block explorers. Seed nodes in the Zilliqa network are comparable to Infura on Ethereum.
 
The seed nodes were at first only operated by Zilliqa themselves, exchanges and Viewblock. Operators of seed nodes like exchanges had no incentive to open them up to the greater public, so they were centralised at first. Decentralisation at the seed node level has been steadily rolled out since March 2020 (Zilliqa Improvement Proposal 3). Currently the number of seed nodes is being increased and they are becoming public-facing, while at the same time PoS is applied to incentivize seed node operators and make it possible for ZIL holders to stake and earn passive yields. Important distinction: seed nodes are not involved with consensus! That is still PoW as the entry ticket and pBFT for the actual consensus.
 
5% of the block rewards have been assigned to seed nodes from the beginning in 2019, and those are used to pay out ZIL stakers. The 5% block rewards, at an annual yield of 10.03%, translate to roughly 610 MM ZILs in total that can be staked. Exchanges use the custodial variant of staking, and wallets like Moonlet will use the non-custodial version (starting in Q3 2020). Staking is done by sending ZILs to a smart contract created by Zilliqa and audited by Quantstamp.
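Working backwards from those numbers: if roughly 61 MM ZIL per year flows to seed nodes (my own inference from the post’s figures, not an official number) and it is paid out at a 10.03% yield, the stakeable pool caps out around 610 MM ZIL:

```ocaml
(* Reverse-engineering the staking cap from the quoted figures:
   if the annual seed-node reward pool is paid out at a 10.03% yield,
   the maximum stakeable amount is pool /. yield. The 61.2 MM pool
   below is inferred from the post's 610 MM figure, not official. *)
let annual_yield = 0.1003
let seed_node_pool = 61.2e6        (* ZIL per year to seed nodes (inferred) *)

let max_stakeable = seed_node_pool /. annual_yield

let () =
  Printf.printf "Max stakeable: ~%.0f MM ZIL\n" (max_stakeable /. 1e6)
```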
 
With a high number of DS and shard nodes, and with seed nodes becoming more decentralized too, Zilliqa qualifies for the label of decentralized in my opinion.
 
Smart contracts
 
Let me start by saying I’m not a developer and my programming skills are quite limited, so I’m taking the ELI5 route (maybe ELI12). But if you are familiar with Javascript, Solidity or specifically OCaml, please head straight to Scilla - read the docs to get a good initial grasp of how Zilliqa’s smart contract language Scilla works, and if you ask yourself “why another programming language?” check this article. If you want to play around with some sample contracts in an IDE, click here. The faucet can be found here. And more information on architecture, dapp development and the API can be found on the Developer Portal.
If you are more into listening and watching: check this recent webinar explaining Zilliqa and Scilla. Link is time-stamped so you’ll start right away with a platform introduction, roadmap 2020 and afterwards a proper Scilla introduction.
 
Generalized: programming languages can be divided into being ‘object-oriented’ or ‘functional’. Here is an ELI5 given by a software development academy: “all programs have two basic components, data – what the program knows – and behavior – what the program can do with that data. So object-oriented programming states that combining data and related behaviors in one place, called an “object”, makes it easier to understand how a particular program works. On the other hand, functional programming argues that data and behavior are different things and should be separated to ensure their clarity.”
 
Scilla is on the functional side and shares similarities with OCaml: “OCaml is a general-purpose programming language with an emphasis on expressiveness and safety. It has an advanced type system that helps catch your mistakes without getting in your way. It's used in environments where a single mistake can cost millions and speed matters, is supported by an active community, and has a rich set of libraries and development tools. For all its power, OCaml is also pretty simple, which is one reason it's often used as a teaching language.”
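To illustrate what “an advanced type system that helps catch your mistakes” buys you, here is a tiny OCaml example of my own: giving token amounts and gas distinct types turns a whole class of money-losing mix-ups into compile-time errors.

```ocaml
(* Distinct types for distinct quantities: the compiler rejects any
   attempt to add a gas amount to a ZIL amount, catching at compile
   time a class of bug that would be a runtime surprise elsewhere. *)
type zil = Zil of float
type gas = Gas of int

let add_zil (Zil a) (Zil b) = Zil (a +. b)

let _gas_limit = Gas 21_000   (* a gas quantity, distinct from ZIL *)

let () =
  let (Zil total) = add_zil (Zil 10.0) (Zil 2.5) in
  Printf.printf "Total: %.3f ZIL\n" total
  (* add_zil (Zil 10.0) _gas_limit  <- would not compile *)
```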
 
Scilla is blockchain agnostic and can be implemented on other blockchains as well; it is recognized by academics and won a Distinguished Artifact Award at the end of last year.
 
One of the reasons why the Zilliqa team decided to create their own programming language focused on preventing smart contract vulnerabilities is that adding logic to a blockchain - programming - means that you cannot afford to make mistakes; otherwise, it could cost you. It’s all great and fun that blockchains are immutable, but updating your code because you found a bug isn’t the same as with a regular web application, for example. And smart contracts inherently involve cryptocurrencies in some form, thus value.
 
Another difference with programming on a blockchain is gas. Every transaction you do on a smart contract platform like Zilliqa or Ethereum costs gas. With gas you basically pay for computational costs. Sending a ZIL from address A to address B costs 0.001 ZIL currently. Smart contracts are more complex, often involve various functions, and require more gas (if gas is a new concept click here).
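As a toy illustration of the fee model (fee = gas used × gas price), here is a sketch; the gas price and gas amounts are assumptions chosen only so that a plain transfer lands on the 0.001 ZIL quoted above, not Zilliqa’s actual parameters:

```ocaml
(* Fee model on a gas-metered chain: fee = gas_used * gas_price.
   The numbers below are illustrative assumptions, chosen so that a
   plain transfer comes out at the 0.001 ZIL quoted above. *)
let gas_price = 0.00002        (* ZIL per unit of gas (assumed) *)

let fee gas_used = float_of_int gas_used *. gas_price

let () =
  Printf.printf "Plain transfer (50 gas):   %.5f ZIL\n" (fee 50);
  Printf.printf "Contract call  (5000 gas): %.5f ZIL\n" (fee 5000)
```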
 
So with Scilla, similar to Solidity, you need to make sure that “every function in your smart contract will run as expected without hitting gas limits. An improper resource analysis may lead to situations where funds may get stuck simply because a part of the smart contract code cannot be executed due to gas limits. Such constraints are not present in traditional software systems”. Scilla design story part 1
 
Some examples of smart contract issues you’d want to avoid are: leaking funds, ‘unexpected changes to critical state variables’ (example: someone other than you setting his or her address as the owner of the smart contract after creation) or simply killing a contract.
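As a schematic of the ‘unexpected owner change’ bug and its guard, here is the shape of the check in OCaml (purely illustrative pseudo-contract logic of my own, not Scilla syntax): reject any change to the owner field unless the sender already is the owner.

```ocaml
(* The 'unexpected owner change' bug, and its guard, sketched as a
   pure state transition: the new state is returned only when the
   sender is the current owner; otherwise an error is returned. *)
type address = string
type state = { owner : address; balance : int }

let set_owner ~sender ~new_owner (s : state) =
  if sender = s.owner
  then Ok { s with owner = new_owner }   (* authorised update *)
  else Error "sender is not the owner"   (* attempted takeover rejected *)

let () =
  let s = { owner = "0xAAA"; balance = 100 } in
  match set_owner ~sender:"0xBBB" ~new_owner:"0xBBB" s with
  | Ok _ -> print_endline "owner changed"
  | Error e -> print_endline ("rejected: " ^ e)
```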
 
Scilla also allows for formal verification. Wikipedia to the rescue: “In the context of hardware and software systems, formal verification is the act of proving or disproving the correctness of intended algorithms underlying a system with respect to a certain formal specification or property, using formal methods of mathematics. Formal verification can be helpful in proving the correctness of systems such as: cryptographic protocols, combinational circuits, digital circuits with internal memory, and software expressed as source code.”
 
And to quote the Scilla team: “Scilla is being developed hand-in-hand with formalization of its semantics and its embedding into the Coq proof assistant — a state-of-the-art tool for mechanized proofs about properties of programs.”
 
Simply put, with Scilla and its accompanying tooling, developers can mathematically prove that the smart contract they’ve written does what they intend it to do.
 
Smart contract on a sharded environment and state sharding
 
There is one more topic I’d like to touch on: smart contract execution in a sharded environment (and the effect of state sharding). This is a complex topic and I’m not able to explain it any easier than what is posted here, but I will try to compress the post into something easy to digest.
 
Earlier on we established that Zilliqa can process transactions in parallel due to network sharding. This is where the linear scalability comes from. We can define three categories of transactions: a transaction from address A to B (Category 1), a transaction where a user interacts with one smart contract (Category 2), and the most complex ones where triggering a transaction results in multiple smart contracts being involved (Category 3). The shards are able to process transactions on their own without interference from the other shards. With Category 1 transactions that is doable; with Category 2 transactions it sometimes works, if the address is in the same shard as the smart contract; but with Category 3 you definitely need communication between the shards. Solving that requires a set of communication rules the protocol needs to follow in order to process all transactions in a generalised fashion.
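Put differently, each category answers the question “can one shard process this alone?”. A schematic classification (my own simplification of the linked post):

```ocaml
(* Transaction categories from the paragraph above, and whether a
   single shard can process them alone. Simplified: Category 2 only
   stays local when sender and contract live in the same shard. *)
type tx =
  | Transfer                                   (* Category 1: A -> B *)
  | Single_contract of { same_shard : bool }   (* Category 2 *)
  | Multi_contract                             (* Category 3 *)

let needs_cross_shard = function
  | Transfer -> false
  | Single_contract { same_shard } -> not same_shard
  | Multi_contract -> true

let () =
  List.iter
    (fun (name, tx) ->
       Printf.printf "%s cross-shard? %b\n" name (needs_cross_shard tx))
    [ "Category 1", Transfer;
      "Category 2 (same shard)", Single_contract { same_shard = true };
      "Category 2 (other shard)", Single_contract { same_shard = false };
      "Category 3", Multi_contract ]
```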
 
And this is where the downsides of state sharding come in. Currently, all shards in Zilliqa have access to the complete state. Yes, the state size (0.1 GB at the moment) grows and all of the nodes need to store it, but it also means they don’t need to shop around for information held on other shards, which would require more communication and add more complexity. Links that require computer science and/or developer knowledge, if you want to dig further: Scilla - language grammar, Scilla - Foundations for Verifiable Decentralised Computations on a Blockchain, Gas Accounting, and NUS x Zilliqa: Smart contract language workshop.
 
Easier-to-follow links on programming Scilla: https://learnscilla.com/home and Ivan on Tech.
 
Roadmap / Zilliqa 2.0
 
There is no strictly defined roadmap, but here are the topics being worked on. And via the Zilliqa website there is also more information on the projects they are working on.
 
Business & Partnerships
 
It’s not only technology in which Zilliqa seems to be excelling, as their ecosystem has been expanding and is starting to grow rapidly. The project is on a mission to provide OpenFinance (OpFi) to the world, and Singapore is the right place to be due to its progressive regulations and futuristic thinking. Singapore has taken a proactive approach towards cryptocurrencies by introducing the Payment Services Act 2019 (PS Act). Among other things, the PS Act will regulate intermediaries dealing with certain cryptocurrencies, with a particular focus on consumer protection and anti-money laundering. It will also provide a stable regulatory licensing and operating framework for cryptocurrency entities, effectively covering all crypto businesses and exchanges based in Singapore. According to PwC, 82% of the surveyed executives in Singapore reported blockchain initiatives underway, and 13% of them have already brought these initiatives live to the market. There is also an increasing list of organizations that are starting to provide digital payment services. Moreover, Singaporean blockchain developer Building Cities Beyond has recently created a $15 million innovation grant to encourage development on its ecosystem. This all suggests that Singapore is trying to position itself as (one of) the leading blockchain hubs in the world.
 
Zilliqa already seems to be taking advantage of this and recently helped launch Hg Exchange on their platform, together with financial institutions PhillipCapital, PrimePartners and Fundnel. Hg Exchange, which is now approved by the Monetary Authority of Singapore (MAS), uses smart contracts to represent digital assets. Through Hg Exchange, financial institutions worldwide can use Zilliqa's safe-by-design smart contracts to enable the trading of private equities. For example, think of companies such as Grab, Airbnb or SpaceX that are not available for public trading right now. Hg Exchange will allow investors to buy shares of private companies & unicorns and capture their value before an IPO. Anquan, the main company behind Zilliqa, has also recently announced that they became a partner and shareholder in TEN31 Bank, a fully regulated bank allowing for tokenization of assets that aims to bridge the gap between conventional banking and the blockchain world. If STOs, the tokenization of assets, and private equity trading continue to increase, then Zilliqa’s public blockchain would be the ideal candidate due to its strategic positioning, partnerships, regulatory compliance and the technology being built on top of it.
 
What is also very encouraging is their focus on banking the un(der)banked. They are launching a stablecoin basket starting with XSGD. As many of you know, stablecoins are currently mostly used for trading; however, Zilliqa is actively trying to broaden their use case. I recommend everybody read this text that Amrit Kumar (one of the co-founders) wrote. These stablecoins will be integrated into the traditional markets and bridge the gap between the crypto world and the traditional world. This could potentially revolutionize and legitimise the crypto space if retailers and companies start to use stablecoins for payments or remittances, for example, instead of them solely being used for trading.
 
Zilliqa also released their DeFi strategic roadmap (dating from November 2019), which seems to align well with their OpFi strategy. A non-custodial DEX made by Switcheo is coming to Zilliqa, which allows cross-chain trading (atomic swaps) between ETH, EOS and ZIL based tokens. They also signed a Memorandum of Understanding for a (soon to be announced) USD stablecoin. And as Zilliqa is all about regulations and being compliant, I’m speculating that it will be a regulated USD stablecoin. Furthermore, XSGD is already created and visible on the block explorer, and XIDR (an Indonesian stablecoin) is also coming soon via StraitsX. Here is also an overview of the Tech Stack for Financial Applications from September 2019. Further quoting Amrit Kumar on this:
 
“There are two basic building blocks in DeFi/OpFi though: 1) stablecoins as you need a non-volatile currency to get access to this market and 2) a dex to be able to trade all these financial assets. The rest are built on top of these blocks.
 
So far, together with our partners and community, we have worked on developing these building blocks with XSGD as a stablecoin. We are working on bringing a USD-backed stablecoin as well. We will soon have a decentralised exchange developed by Switcheo. And with HGX going live, we are also venturing into the tokenization space. More to come in the future.”
 
Additionally, they also have the ZILHive initiative that injects capital into projects. There have already been 6 waves of various teams working on infrastructure, innovation and research, and they are not from ASEAN or Singapore only but global: see the grantees breakdown by country. Over 60 project teams from over 20 countries have contributed to Zilliqa's ecosystem, including individuals and teams developing wallets, explorers, developer toolkits, smart contract testing frameworks, dapps, etc. As some of you may know, Unstoppable Domains (UD) blew up when they launched on Zilliqa. UD aims to replace cryptocurrency addresses with a human-readable name and allows for uncensorable websites. Zilliqa will probably be the only one able to handle all these transactions on-chain due to its ability to scale and the resulting low fees, which is why the UD team launched on Zilliqa in the first place. Furthermore, Zilliqa also has a strong emphasis on security, compliance, and privacy, which is why they partnered with companies like Elliptic, ChainSecurity (part of PwC Switzerland), and Incognito. Their sister company Aqilliz (Zilliqa spelled backwards) focuses on revolutionizing the digital advertising space and is doing interesting things like using Zilliqa to track outdoor digital ads with companies like Foodpanda.
 
Zilliqa is listed on nearly all major exchanges, has several different fiat gateways, and was recently added to Binance’s margin and futures trading with really good volume. They also have a very impressive team with good credentials and experience. They don't just have “tech people”: they have a mix of tech people, business people, marketeers, scientists, and more. Naturally, it's good to have a mix of people with different skill sets if you work in the crypto space.
 
Marketing & Community
 
Zilliqa has a very strong community. If you follow their Twitter, you’ll see their engagement is much higher than you’d expect for a coin with approximately 80k followers. They have also been ‘coin of the day’ on LunarCrush many times (LunarCrush tracks real-time cryptocurrency value and social data). According to their data, Zilliqa seems to have a more fundamental and deeper understanding of marketing and community engagement than almost all other coins. While almost all coins have been a bit frozen in the last months, Zilliqa seems to be on its own bull run: it was somewhere in the 100s a few months ago and is currently ranked #46 on CoinGecko. Their official Telegram has over 20k people and is very active, and their community channel, now over 7k, is more active and larger than many other projects’ official channels. Their local communities also seem to be growing.
 
Moreover, their community started ‘Zillacracy’ together with the Zilliqa core team (see www.zillacracy.com). It’s a community-run initiative where people from all over the world help with marketing and development on Zilliqa. Since its launch in February 2020 they have been doing a lot, and they will also run their own non-custodial seed node for staking. This seed node will allow them to generate revenue and become a self-sustaining entity that could potentially scale up into a decentralized company working in parallel with the Zilliqa core team. Comparing it to all the other smart contract platforms (e.g. Cardano, EOS, Tezos etc.), none of them seem to have started a similar initiative (correct me if I’m wrong though). This suggests, in my opinion, that these other smart contract platforms do not fully understand how to utilize the ‘power of the community’. This is something you cannot ‘buy with money’, and it gives many projects in the space a disadvantage.
 
Zilliqa also released two social products called SocialPay and Zeeves. SocialPay allows users to earn ZILs while tweeting with a specific hashtag. They have recently used it in partnership with the Singapore Red Cross for a marketing campaign after their initial pilot program. It seems like a very valuable social product with a good use case. I can see a lot of traditional companies entering the space through this product, which they seem to suggest will happen. Tokenizing hashtags with smart contracts to get network effect is a very smart and innovative idea.
 
Regarding Zeeves: this is a tipping bot for Telegram. They already have 1000s of signups and plan to keep upgrading it for more and more people to use (e.g. they recently added a quiz feature). They also use it during AMAs to reward people in real time. It’s a very smart approach to grow their communities and get people familiar with ZIL. I can see this becoming very big on Telegram. This tool suggests, again, that the Zilliqa team has a deeper understanding of what the crypto space and community need and is good at finding the right innovative tools to grow and scale.
 
To be honest, I haven’t covered everything (I’m also reaching the character limit haha). So many updates have been happening lately that it's hard to keep up: the International Monetary Fund mentioning Zilliqa in their report, custodial and non-custodial staking, Binance margin, futures, the widget, entering the Indian market, and more. The Head of Marketing, Colin Miles, has also released this as an overview of what is coming next. And last but not least, Vitalik Buterin has mentioned Zilliqa lately, acknowledging the project and noting that both projects have a lot of room to grow. There is much more info of course, and a good part of it has been served to you on a silver platter. I invite you to continue researching by yourself :-) And if you have any comments or questions please post them here!
submitted by haveyouheardaboutit to CryptoCurrency [link] [comments]

Why i’m bullish on Zilliqa (long read)

Hey all, I've been researching coins since 2017 and have gone through 100s of them in the last 3 years. I got introduced to blockchain via Bitcoin of course, analysed Ethereum thereafter and from that moment I have a keen interest in smart contact platforms. I’m passionate about Ethereum but I find Zilliqa to have a better risk reward ratio. Especially because Zilliqa has found an elegant balance between being secure, decentralised and scalable in my opinion.
 
Below I post my analysis why from all the coins I went through I’m most bullish on Zilliqa (yes I went through Tezos, EOS, NEO, VeChain, Harmony, Algorand, Cardano etc.). Note that this is not investment advice and although it's a thorough analysis there is obviously some bias involved. Looking forward to what you all think!
 
Fun fact: the name Zilliqa is a play on ‘silica’ silicon dioxide which means “Silicon for the high-throughput consensus computer.”
 
This post is divided into (i) Technology, (ii) Business & Partnerships, and (iii) Marketing & Community. I’ve tried to make the technology part readable for a broad audience. If you’ve ever tried understanding the inner workings of Bitcoin and Ethereum you should be able to grasp most parts. Otherwise just skim through and once you are zoning out head to the next part.
 
Technology and some more:
 
Introduction The technology is one of the main reasons why I’m so bullish on Zilliqa. First thing you see on their website is: “Zilliqa is a high-performance, high-security blockchain platform for enterprises and next-generation applications.” These are some bold statements.
 
Before we deep dive into the technology let’s take a step back in time first as they have quite the history. The initial research paper from which Zilliqa originated dates back to August 2016: Elastico: A Secure Sharding Protocol For Open Blockchains where Loi Luu (Kyber Network) is one of the co-authors. Other ideas that led to the development of what Zilliqa has become today are: Bitcoin-NG, collective signing CoSi, ByzCoin and Omniledger.
 
The technical white paper was made public in August 2017 and since then they have achieved everything stated in the white paper and also created their own open source intermediate level smart contract language called Scilla (functional programming language similar to OCaml) too.
 
Mainnet is live since end of January 2019 with daily transaction rate growing continuously. About a week ago mainnet reached 5 million transactions, 500.000+ addresses in total along with 2400 nodes keeping the network decentralised and secure. Circulating supply is nearing 11 billion and currently only mining rewards are left. Maximum supply is 21 billion with annual inflation being 7.13% currently and will only decrease with time.
 
Zilliqa realised early on that the usage of public cryptocurrencies and smart contracts were increasing but decentralised, secure and scalable alternatives were lacking in the crypto space. They proposed to apply sharding onto a public smart contract blockchain where the transaction rate increases almost linear with the increase in amount of nodes. More nodes = higher transaction throughput and increased decentralisation. Sharding comes in many forms and Zilliqa uses network-, transaction- and computational sharding. Network sharding opens up the possibility of using transaction- and computational sharding on top. Zilliqa does not use state sharding for now. We’ll come back to this later.
 
Before we continue disecting how Zilliqa achieves such from a technological standpoint it’s good to keep in mind that a blockchain being decentralised and secure and scalable is still one of the main hurdles in allowing widespread usage of decentralised networks. In my opinion this needs to be solved first before blockchains can get to the point where they can create and add large scale value. So I invite you to read the next section to grasp the underlying fundamentals. Because after all these premises need to be true otherwise there isn’t a fundamental case to be bullish on Zilliqa, right?
 
Down the rabbit hole
 
How have they achieved this? Let’s define the basics first: key players on Zilliqa are the users and the miners. A user is anybody who uses the blockchain to transfer funds or run smart contracts. Miners are the (shard) nodes in the network who run the consensus protocol and get rewarded for their service in Zillings (ZIL). The mining network is divided into several smaller networks called shards, which is also referred to as ‘network sharding’. Miners subsequently are randomly assigned to a shard by another set of miners called DS (Directory Service) nodes. The regular shards process transactions and the outputs of these shards are eventually combined by the DS shard as they reach consensus on the final state. More on how these DS shards reach consensus (via pBFT) will be explained later on.
 
The Zilliqa network produces two types of blocks: DS blocks and Tx blocks. One DS Block consists of 100 Tx Blocks. And as previously mentioned there are two types of nodes concerned with reaching consensus: shard nodes and DS nodes. Becoming a shard node or DS node is being defined by the result of a PoW cycle (Ethash) at the beginning of the DS Block. All candidate mining nodes compete with each other and run the PoW (Proof-of-Work) cycle for 60 seconds and the submissions achieving the highest difficulty will be allowed on the network. And to put it in perspective: the average difficulty for one DS node is ~ 2 Th/s equaling 2.000.000 Mh/s or 55 thousand+ GeForce GTX 1070 / 8 GB GPUs at 35.4 Mh/s. Each DS Block 10 new DS nodes are allowed. And a shard node needs to provide around 8.53 GH/s currently (around 240 GTX 1070s). Dual mining ETH/ETC and ZIL is possible and can be done via mining software such as Phoenix and Claymore. There are pools and if you have large amounts of hashing power (Ethash) available you could mine solo.
 
The PoW cycle of 60 seconds is a peak performance and acts as an entry ticket to the network. The entry ticket is called a sybil resistance mechanism and makes it incredibly hard for adversaries to spawn lots of identities and manipulate the network with these identities. And after every 100 Tx Blocks which corresponds to roughly 1,5 hour this PoW process repeats. In between these 1,5 hour no PoW needs to be done meaning Zilliqa’s energy consumption to keep the network secure is low. For more detailed information on how mining works click here.
Okay, hats off to you. You have made it this far. Before we go any deeper down the rabbit hole we first must understand why Zilliqa goes through all of the above technicalities and understand a bit more what a blockchain on a more fundamental level is. Because the core of Zilliqa’s consensus protocol relies on the usage of pBFT (practical Byzantine Fault Tolerance) we need to know more about state machines and their function. Navigate to Viewblock, a Zilliqa block explorer, and just come back to this article. We will use this site to navigate through a few concepts.
 
We have established that Zilliqa is a public and distributed blockchain. Meaning that everyone with an internet connection can send ZILs, trigger smart contracts etc. and there is no central authority who fully controls the network. Zilliqa and other public and distributed blockchains (like Bitcoin and Ethereum) can also be defined as state machines.
 
Taking the liberty of paraphrasing examples and definitions given by Samuel Brooks’ medium article, he describes the definition of a blockchain (like Zilliqa) as:
“A peer-to-peer, append-only datastore that uses consensus to synchronise cryptographically-secure data”.
 
Next he states that: >“blockchains are fundamentally systems for managing valid state transitions”.* For some more context, I recommend reading the whole medium article to get a better grasp of the definitions and understanding of state machines. Nevertheless, let’s try to simplify and compile it into a single paragraph. Take traffic lights as an example: all its states (red, amber and green) are predefined, all possible outcomes are known and it doesn’t matter if you encounter the traffic light today or tomorrow. It will still behave the same. Managing the states of a traffic light can be done by triggering a sensor on the road or pushing a button resulting in one traffic lights’ state going from green to red (via amber) and another light from red to green.
 
With public blockchains like Zilliqa this isn’t so straightforward and simple. It started with block #1 almost 1,5 years ago and every 45 seconds or so a new block linked to the previous block is being added. Resulting in a chain of blocks with transactions in it that everyone can verify from block #1 to the current #647.000+ block. The state is ever changing and the states it can find itself in are infinite. And while the traffic light might work together in tandem with various other traffic lights, it’s rather insignificant comparing it to a public blockchain. Because Zilliqa consists of 2400 nodes who need to work together to achieve consensus on what the latest valid state is while some of these nodes may have latency or broadcast issues, drop offline or are deliberately trying to attack the network etc.
 
Now go back to the Viewblock page take a look at the amount of transaction, addresses, block and DS height and then hit refresh. Obviously as expected you see new incremented values on one or all parameters. And how did the Zilliqa blockchain manage to transition from a previous valid state to the latest valid state? By using pBFT to reach consensus on the latest valid state.
 
After having obtained the entry ticket, miners execute pBFT to reach consensus on the ever changing state of the blockchain. pBFT requires a series of network communication between nodes, and as such there is no GPU involved (but CPU). Resulting in the total energy consumed to keep the blockchain secure, decentralised and scalable being low.
 
pBFT stands for practical Byzantine Fault Tolerance and is an optimisation on the Byzantine Fault Tolerant algorithm. To quote Blockonomi: “In the context of distributed systems, Byzantine Fault Tolerance is the ability of a distributed computer network to function as desired and correctly reach a sufficient consensus despite malicious components (nodes) of the system failing or propagating incorrect information to other peers.” Zilliqa is such a distributed computer network and depends on the honesty of the nodes (shard and DS) to reach consensus and to continuously update the state with the latest block. If pBFT is a new term for you I can highly recommend the Blockonomi article.
 
The idea of pBFT was introduced in 1999 - one of the authors even won a Turing award for it - and it is well researched and applied in various blockchains and distributed systems nowadays. If you want more advanced information than the Blockonomi link provides click here. And if you’re in between Blockonomi and University of Singapore read the Zilliqa Design Story Part 2 dating from October 2017.
Quoting from the Zilliqa tech whitepaper: “pBFT relies upon a correct leader (which is randomly selected) to begin each phase and proceed when the sufficient majority exists. In case the leader is byzantine it can stall the entire consensus protocol. To address this challenge, pBFT offers a view change protocol to replace the byzantine leader with another one.”
 
pBFT can tolerate ⅓ of the nodes being dishonest (offline counts as Byzantine = dishonest) and the consensus protocol will function without stalling or hiccups. Once there are more than ⅓ of dishonest nodes but no more than ⅔ the network will be stalled and a view change will be triggered to elect a new DS leader. Only when more than ⅔ of the nodes are dishonest (>66%) double spend attacks become possible.
 
If the network stalls no transactions can be processed and one has to wait until a new honest leader has been elected. When the mainnet was just launched and in its early phases, view changes happened regularly. As of today the last stalling of the network - and view change being triggered - was at the end of October 2019.
 
Another benefit of using pBFT for consensus besides low energy is the immediate finality it provides. Once your transaction is included in a block and the block is added to the chain it’s done. Lastly, take a look at this article where three types of finality are being defined: probabilistic, absolute and economic finality. Zilliqa falls under the absolute finality (just like Tendermint for example). Although lengthy already we skipped through some of the inner workings from Zilliqa’s consensus: read the Zilliqa Design Story Part 3 and you will be close to having a complete picture on it. Enough about PoW, sybil resistance mechanism, pBFT etc. Another thing we haven’t looked at yet is the amount of decentralisation.
 
Decentralisation
 
Currently there are four shards, each one of them consisting of 600 nodes. 1 shard with 600 so called DS nodes (Directory Service - they need to achieve a higher difficulty than shard nodes) and 1800 shard nodes of which 250 are shard guards (centralised nodes controlled by the team). The amount of shard guards has been steadily declining from 1200 in January 2019 to 250 as of May 2020. On the Viewblock statistics you can see that many of the nodes are being located in the US but those are only the (CPU parts of the) shard nodes who perform pBFT. There is no data from where the PoW sources are coming. And when the Zilliqa blockchain starts reaching their transaction capacity limit, a network upgrade needs to be executed to lift the current cap of maximum 2400 nodes to allow more nodes and formation of more shards which will allow to network to keep on scaling according to demand.
Besides shard nodes there are also seed nodes. The main role of seed nodes is to serve as direct access points (for end users and clients) to the core Zilliqa network that validates transactions. Seed nodes consolidate transaction requests and forward these to the lookup nodes (another type of nodes) for distribution to the shards in the network. Seed nodes also maintain the entire transaction history and the global state of the blockchain which is needed to provide services such as block explorers. Seed nodes in the Zilliqa network are comparable to Infura on Ethereum.
 
The seed nodes were first only operated by Zilliqa themselves, exchanges and Viewblock. Operators of seed nodes like exchanges had no incentive to open them for the greater public.They were centralised at first. Decentralisation at the seed nodes level has been steadily rolled out since March 2020 ( Zilliqa Improvement Proposal 3 ). Currently the amount of seed nodes is being increased, they are public facing and at the same time PoS is applied to incentivize seed node operators and make it possible for ZIL holders to stake and earn passive yields. Important distinction: seed nodes are not involved with consensus! That is still PoW as entry ticket and pBFT for the actual consensus.
 
5% of the block rewards are being assigned to seed nodes (from the beginning in 2019) and those are being used to pay out ZIL stakers.The 5% block rewards with an annual yield of 10.03% translates to roughly 610 MM ZILs in total that can be staked. Exchanges use the custodial variant of staking and wallets like Moonlet will use the non custodial version (starting in Q3 2020). Staking is being done by sending ZILs to a smart contract created by Zilliqa and audited by Quantstamp.
 
With a high amount of DS & shard nodes and seed nodes becoming more decentralised too, Zilliqa qualifies for the label of decentralised in my opinion.
 
Smart contracts
 
Let me start by saying I’m not a developer and my programming skills are quite limited. So I‘m taking the ELI5 route (maybe 12) but if you are familiar with Javascript, Solidity or specifically OCaml please head straight to Scilla - read the docs to get a good initial grasp of how Zilliqa’s smart contract language Scilla works and if you ask yourself “why another programming language?” check this article. And if you want to play around with some sample contracts in an IDE click here. Faucet can be found here. And more information on architecture, dapp development and API can be found on the Developer Portal.
If you are more into listening and watching: check this recent webinar explaining Zilliqa and Scilla. Link is time stamped so you’ll start right away with a platform introduction, R&D roadmap 2020 and afterwards a proper Scilla introduction.
 
Generalised: programming languages can be divided into being ‘object oriented’ or ‘functional’. Here is an ELI5 given by software development academy: > “all programmes have two basic components, data – what the programme knows – and behaviour – what the programme can do with that data. So object-oriented programming states that combining data and related behaviours in one place, is called “object”, which makes it easier to understand how a particular program works. On the other hand, functional programming argues that data and behaviour are different things and should be separated to ensure their clarity.”
 
Scilla is on the functional side and shares similarities with OCaml: > OCaml is a general purpose programming language with an emphasis on expressiveness and safety. It has an advanced type system that helps catch your mistakes without getting in your way. It's used in environments where a single mistake can cost millions and speed matters, is supported by an active community, and has a rich set of libraries and development tools. For all its power, OCaml is also pretty simple, which is one reason it's often used as a teaching language.
 
Scilla is blockchain agnostic, can be implemented onto other blockchains as well, is recognised by academics and won a so called Distinguished Artifact Award award at the end of last year.
 
One of the reasons why the Zilliqa team decided to create their own programming language focused on preventing smart contract vulnerabilities safety is that adding logic on a blockchain, programming, means that you cannot afford to make mistakes. Otherwise it could cost you. It’s all great and fun blockchains being immutable but updating your code because you found a bug isn’t the same as with a regular web application for example. And with smart contracts it inherently involves cryptocurrencies in some form thus value.
 
Another difference with programming languages on a blockchain is gas. Every transaction you do on a smart contract platform like Zilliqa for Ethereum costs gas. With gas you basically pay for computational costs. Sending a ZIL from address A to address B costs 0.001 ZIL currently. Smart contracts are more complex, often involve various functions and require more gas (if gas is a new concept click here ).
 
So with Scilla, similar to Solidity, you need to make sure that “every function in your smart contract will run as expected without hitting gas limits. An improper resource analysis may lead to situations where funds may get stuck simply because a part of the smart contract code cannot be executed due to gas limits. Such constraints are not present in traditional software systems”. Scilla design story part 1
 
Some examples of smart contract issues you’d want to avoid are: leaking funds, ‘unexpected changes to critical state variables’ (example: someone other than you setting his or her address as the owner of the smart contract after creation) or simply killing a contract.
 
Scilla also allows for formal verification. Wikipedia to the rescue:
In the context of hardware and software systems, formal verification is the act of proving or disproving the correctness of intended algorithms underlying a system with respect to a certain formal specification or property, using formal methods of mathematics.
 
Formal verification can be helpful in proving the correctness of systems such as: cryptographic protocols, combinational circuits, digital circuits with internal memory, and software expressed as source code.
 
Scilla is being developed hand-in-hand with formalization of its semantics and its embedding into the Coq proof assistant — a state-of-the art tool for mechanized proofs about properties of programs.”
 
Simply put, with Scilla and accompanying tooling developers can be mathematically sure and proof that the smart contract they’ve written does what he or she intends it to do.
 
Smart contract on a sharded environment and state sharding
 
There is one more topic I’d like to touch on: smart contract execution in a sharded environment (and what is the effect of state sharding). This is a complex topic. I’m not able to explain it any easier than what is posted here. But I will try to compress the post into something easy to digest.
 
Earlier on we have established that Zilliqa can process transactions in parallel due to network sharding. This is where the linear scalability comes from. We can define simple transactions: a transaction from address A to B (Category 1), a transaction where a user interacts with one smart contract (Category 2) and the most complex ones where triggering a transaction results in multiple smart contracts being involved (Category 3). The shards are able to process transactions on their own without interference of the other shards. With Category 1 transactions that is doable, with Category 2 transactions sometimes if that address is in the same shard as the smart contract but with Category 3 you definitely need communication between the shards. Solving that requires to make a set of communication rules the protocol needs to follow in order to process all transactions in a generalised fashion.
 
And this is where the downsides of state sharding comes in currently. All shards in Zilliqa have access to the complete state. Yes the state size (0.1 GB at the moment) grows and all of the nodes need to store it but it also means that they don’t need to shop around for information available on other shards. Requiring more communication and adding more complexity. Computer science knowledge and/or developer knowledge required links if you want to dig further: Scilla - language grammar Scilla - Foundations for Verifiable Decentralised Computations on a Blockchain Gas Accounting NUS x Zilliqa: Smart contract language workshop
 
Easier to follow links on programming Scilla https://learnscilla.com/home Ivan on Tech
 
Roadmap / Zilliqa 2.0
 
There is no strict defined roadmap but here are topics being worked on. And via the Zilliqa website there is also more information on the projects they are working on.
 
Business & Partnerships  
It’s not only technology in which Zilliqa seems to be excelling as their ecosystem has been expanding and starting to grow rapidly. The project is on a mission to provide OpenFinance (OpFi) to the world and Singapore is the right place to be due to its progressive regulations and futuristic thinking. Singapore has taken a proactive approach towards cryptocurrencies by introducing the Payment Services Act 2019 (PS Act). Among other things, the PS Act will regulate intermediaries dealing with certain cryptocurrencies, with a particular focus on consumer protection and anti-money laundering. It will also provide a stable regulatory licensing and operating framework for cryptocurrency entities, effectively covering all crypto businesses and exchanges based in Singapore. According to PWC 82% of the surveyed executives in Singapore reported blockchain initiatives underway and 13% of them have already brought the initiatives live to the market. There is also an increasing list of organisations that are starting to provide digital payment services. Moreover, Singaporean blockchain developers Building Cities Beyond has recently created an innovation $15 million grant to encourage development on its ecosystem. This all suggest that Singapore tries to position itself as (one of) the leading blockchain hubs in the world.
 
Zilliqa seems to be taking advantage of this already and recently helped launch Hg Exchange on their platform, together with financial institutions PhillipCapital, PrimePartners and Fundnel. Hg Exchange, which is now approved by the Monetary Authority of Singapore (MAS), uses smart contracts to represent digital assets. Through Hg Exchange, financial institutions worldwide can use Zilliqa's safe-by-design smart contracts to enable the trading of private equities. For example, think of companies such as Grab, AirBnB and SpaceX that are not available for public trading right now. Hg Exchange will allow investors to buy shares of private companies & unicorns and capture their value before an IPO. Anquan, the main company behind Zilliqa, has also recently announced that it became a partner and shareholder in TEN31 Bank, a fully regulated bank allowing for the tokenization of assets that aims to bridge the gap between conventional banking and the blockchain world. If STOs, the tokenization of assets, and equity trading continue to increase, then Zilliqa’s public blockchain would be the ideal candidate due to its strategic positioning, partnerships, regulatory compliance and the technology being built on top of it.
 
What is also very encouraging is their focus on banking the un(der)banked. They are launching a stablecoin basket, starting with XSGD. As many of you know, stablecoins are currently mostly used for trading. However, Zilliqa is actively trying to broaden the use cases of stablecoins. I recommend everybody read this text that Amrit Kumar (one of the co-founders) wrote. These stablecoins will be integrated in the traditional markets and bridge the gap between the crypto world and the traditional world. This could potentially revolutionize and legitimise the crypto space if retailers and companies, for example, start to use stablecoins for payments or remittances instead of solely for trading.
 
Zilliqa also released their DeFi strategic roadmap (dating from November 2019), which seems to align well with their OpFi strategy. A non-custodial DEX made by Switcheo is coming to Zilliqa, allowing cross-chain trading (atomic swaps) between ETH, EOS and ZIL based tokens. They also signed a Memorandum of Understanding for a (soon to be announced) USD stablecoin, and as Zilliqa is all about regulations and being compliant, I’m speculating that it will be a regulated USD stablecoin. Furthermore, XSGD is already created and visible on the block explorer, and XIDR (an Indonesian stablecoin) is also coming soon via StraitsX. Here is also an overview of the Tech Stack for Financial Applications from September 2019. Further quoting Amrit Kumar on this:
 
“There are two basic building blocks in DeFi/OpFi though: 1) stablecoins, as you need a non-volatile currency to get access to this market, and 2) a DEX to be able to trade all these financial assets. The rest are built on top of these blocks.
 
So far, together with our partners and community, we have worked on developing these building blocks with XSGD as a stablecoin. We are working on bringing a USD-backed stablecoin as well. We will soon have a decentralised exchange developed by Switcheo. And with HGX going live, we are also venturing into the tokenization space. More to come in the future.”
 
Additionally, they also have the ZILHive initiative that injects capital into projects. There have already been 6 waves of various teams working on infrastructure, innovation and research, and they are not from ASEAN or Singapore only but global: see the grantees breakdown by country. Over 60 project teams from over 20 countries have contributed to Zilliqa's ecosystem, including individuals and teams developing wallets, explorers, developer toolkits, smart contract testing frameworks, dapps, etc. As some of you may know, Unstoppable Domains (UD) blew up when they launched on Zilliqa. UD aims to replace cryptocurrency addresses with a human-readable name and allows for uncensorable websites. Zilliqa will probably be the only one able to handle all these transactions on-chain, due to its ability to scale and the resulting low fees, which is why the UD team launched on Zilliqa in the first place. Furthermore, Zilliqa also has a strong emphasis on security, compliance, and privacy, which is why they partnered with companies like Elliptic, ChainSecurity (part of PwC Switzerland), and Incognito. Their sister company Aqilliz (Zilliqa spelled backwards) focuses on revolutionizing the digital advertising space and is doing interesting things like using Zilliqa to track outdoor digital ads with companies like Foodpanda.
 
Zilliqa is listed on nearly all major exchanges, has several different fiat gateways, and has recently been added to Binance’s margin and futures trading with really good volume. They also have a very impressive team with good credentials and experience. They don’t just have “tech people”: they have a mix of tech people, business people, marketeers, scientists, and more. Naturally, it's good to have a mix of people with different skill sets if you work in the crypto space.
 
Marketing & Community
 
Zilliqa has a very strong community. If you just follow their Twitter, their engagement is much higher than you would expect for a coin with approximately 80k followers. They have also been ‘coin of the day’ on LunarCrush many times (LunarCrush tracks real-time cryptocurrency value and social data). According to that data, Zilliqa seems to have a more fundamental and deeper understanding of marketing and community engagement than almost all other coins. While almost all coins have been a bit frozen in the last months, Zilliqa seems to be on its own bull run: it was ranked somewhere in the 100s a few months ago and is currently ranked #46 on CoinGecko. Their official Telegram has over 20k people and is very active, and their community channel, now over 7k, is more active and larger than many other projects’ official channels. Their local communities also seem to be growing.
 
Moreover, their community started ‘Zillacracy’ together with the Zilliqa core team (see www.zillacracy.com). It’s a community-run initiative where people from all over the world are now helping with marketing and development on Zilliqa. Since its launch in February 2020 they have been doing a lot, and they will also run their own non-custodial seed node for staking. This seed node will allow them to start generating revenue and become a self-sustaining entity that could potentially scale up into a decentralized company working in parallel with the Zilliqa core team. Comparing it to all the other smart contract platforms (e.g. Cardano, EOS, Tezos etc.), they don't seem to have started similar initiatives (correct me if I’m wrong though). This suggests, in my opinion, that these other smart contract platforms do not fully understand how to utilize the ‘power of the community’. This is something you cannot ‘buy with money’, and it puts many projects in the space at a disadvantage.
 
Zilliqa also released two social products called SocialPay and Zeeves. SocialPay allows users to earn ZILs while tweeting with a specific hashtag. They have recently used it in partnership with the Singapore Red Cross for a marketing campaign after their initial pilot program. It seems like a very valuable social product with a good use case; I can see a lot of traditional companies entering the space through this product, which they seem to suggest will happen. Tokenizing hashtags with smart contracts to create network effects is a very smart and innovative idea.
 
Regarding Zeeves: this is a tipping bot for Telegram. They already have thousands of signups and plan to keep upgrading it for more and more people to use (e.g. they recently added a quiz feature). They also use it during AMAs to reward people in real time. It’s a very smart approach to growing their communities and getting people familiar with ZIL; I can see this becoming very big on Telegram. This tool suggests, again, that the Zilliqa team has a deeper understanding of what the crypto space and community need, and is good at finding the right innovative tools to grow and scale.
 
To be honest, I haven’t covered everything (I’m also reaching the character limit, haha). So many updates have been happening lately that it's hard to keep up: the International Monetary Fund mentioning Zilliqa in their report, custodial and non-custodial staking, Binance margin, futures & widget, entering the Indian market, and more. The Head of Marketing, Colin Miles, has also released this as an overview of what is coming next. And last but not least, Vitalik Buterin has been mentioning Zilliqa lately, acknowledging the project and noting that both have a lot of room to grow. There is much more info of course, and a good part of it has been served to you on a silver platter. I invite you to continue researching by yourself :-) And if you have any comments or questions, please post here!
submitted by haveyouheardaboutit to CryptoCurrency

The Problem with PoW
(and what is being done to solve it)

[Image: "Frustrated Miners" - miners have always had it rough..]

Proof of Work (PoW) is one of the most commonly used consensus mechanisms entrusted to secure and validate many of today’s most successful cryptocurrencies, Bitcoin being one. Battle-hardened and having weathered the test of time, Bitcoin has demonstrated the undeniable strength and reliability of the PoW consensus model through sheer market saturation, and of course, its persistency.
In addition to the cost of powerful computing hardware, miners prove that they are benefiting the network by expending energy in the form of electricity: solving and hashing away complex math problems on their computers, using whatever suitable tools they have at their disposal. The mathematics involved in securing proof of work revolve around unique algorithms, each with their own benefits and vulnerabilities, and can require different software/hardware to mine depending on the coin.
Because each block presents a unique puzzle - finding a nonce such that the block's hash falls below a network-set target - the “work” has to be performed for each block individually, and the difficulty of the problem can be increased as the speed at which blocks are solved increases. A toy version of that loop is sketched below.
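For the curious, here is a toy Python version of the per-block puzzle, assuming a simplified Bitcoin-style double-SHA256 check; the header layout and leading-zero-bits difficulty encoding are illustrative simplifications, not any specific coin's format.

import hashlib

def mine_block(header: bytes, difficulty_bits: int) -> int:
    """Toy proof of work: find a nonce whose double-SHA256 hash of
    (header + nonce) falls below a target set by difficulty_bits."""
    target = 2 ** (256 - difficulty_bits)  # smaller target = harder puzzle
    nonce = 0
    while True:
        payload = header + nonce.to_bytes(8, "little")
        h = hashlib.sha256(hashlib.sha256(payload).digest()).digest()
        if int.from_bytes(h, "big") < target:
            return nonce  # puzzle solved; this block's "work" is done
        nonce += 1

# e.g. mine_block(b"block header bytes", 16) returns quickly;
# each extra difficulty bit roughly doubles the expected work.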

Hashrates and Hardware Types

While proof of work is an effective means of securing a blockchain, it inherently promotes competition amongst miners seeking higher and higher hashrates due to the rewards earned by the node who wins the right to add the next block. In turn, these higher hash rates benefit the blockchain, providing better security when it’s a result of a well distributed/decentralized network of miners.
When Bitcoin first launched its genesis block, it was mined exclusively by CPUs. Over the years, various programmers and developers have devised newer, faster, and more energy efficient ways to generate higher hashrates; some by perfecting the software end of things, and others, when the incentives are great enough, create expensive specialized hardware such as ASICs (application-specific integrated circuit). With the express purpose of extracting every last bit of hashing power, efficiency being paramount, ASICs are stripped down, bare minimum, hardware representations of a specific coin’s algorithm.
This gives ASICs a massive advantage in terms of raw hashing power and energy consumption over CPUs/GPUs, but with significant drawbacks: they are very expensive to design and manufacture, translating to a high economic barrier for the casual miner. And because they are hardware representations of a single targeted algorithm, if a project decides to fork and change algorithms suddenly, your powerful brand-new ASIC becomes a very expensive paperweight. The high costs of developing and manufacturing ASICs, and the associated risks involved, make them unfit for mass adoption at this time.
Somewhere on the high end, in the vast hashrate expanse created between GPU and ASIC, sits the FPGA (field programmable gate array). FPGAs are basically ASICs that make some compromises with efficiency in order to have more flexibility, namely they are reprogrammable and often used in the “field” to test an algorithm before implementing it in an ASIC. As a precursor to the ASIC, FPGAs are somewhat similar to GPUs in their flexibility, but require advanced programming skills and, like ASICs, are expensive and still fairly uncommon.

2 Guys 1 ASIC

One of the issues with proof of work incentivizing the pursuit of higher hashrates lies in how the network calculates block-reward coinbase payouts and rewards miners for the work they have submitted. If a coin generates, say, a block a minute, and this is a constant, then what happens if more miners jump on the network and do more work? The network cannot pay out more than one block reward per minute, so a difficulty mechanism is used to maintain balance. The difficulty scales up and down in response to the overall nethash: if many miners join the network, or extremely high-hashing devices such as ASICs or FPGAs jump on, the network responds by making the problems harder, effectively giving an edge to hardware that can solve them faster and balancing the network. This not only maintains the block-a-minute reward, it has the added side effect of energy requirements that scale up with network adoption. A toy retarget rule is sketched below.
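This is not any particular coin's actual retargeting formula, but a minimal Python sketch of the idea; the 60-second target and the linear scaling are assumptions for illustration.

def retarget(old_difficulty: float, actual_block_time: float,
             target_block_time: float = 60.0) -> float:
    """Toy difficulty adjustment: if blocks arrive faster than the target
    pace (more nethash joined), scale difficulty up; if slower, scale down."""
    return old_difficulty * (target_block_time / actual_block_time)

# e.g. nethash doubles, so blocks start arriving every 30s instead of 60s:
# retarget(1000.0, 30.0) -> 2000.0, restoring the block-a-minute pace.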
Imagine, for example, if one miner gets on a network all alone with a CPU doing 50 MH/s and is getting all 100 coins that can possibly be paid out in a day. Then, if another miner jumps on the network with the same CPU, each miner would receive 50 coins in a day instead of 100, since they are splitting the required work evenly, despite the fact that the net electrical output has doubled along with the work. Electricity costs miners money and is a factor in driving up coin price along with adoption, and since more people are now mining, the coin is less centralized. Now let’s say a large corporation has found it profitable to manufacture an ASIC for this coin, knowing they will make their money back mining it or selling the units to professionals. They join the network doing 900 MH/s and will be pulling in 90 coins a day, while the two guys with their CPUs each get 5 now. Those two guys aren’t very happy, but the corporation is. Not only does this negatively affect the miners, it compromises the security of the entire network by centralizing the coin supply and hashrate, opening the doors to double spends and 51% attacks from potential malicious actors. Uncertainty of motives and questionable validity in a distributed ledger do not mix.
When technology advances in a field, it is usually applauded and welcomed with open arms, but in the world of crypto things can work quite differently. One of the glaring flaws in the current model and the advent of specialized hardware is that it’s never ending. Suppose the two men from the rather extreme example above took out loans to get themselves that ASIC they heard about, the one that can get them 90 coins a day. When they join the other ASIC on the network, the difficulty adjusts to keep daily payouts consistent at 100, and they will each receive only 33 coins instead of 90, since the reward is now being split three ways (the payout arithmetic is sketched below). Now what happens if a better ASIC is released by that corporation? Hopefully those two guys were able to pay off their loans and sell their old ASICs before they became obsolete.
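The payout split in these examples is just proportional-share arithmetic; here is a minimal Python sketch using the numbers from above.

def daily_coins(my_hashrate: float, net_hashrate: float,
                daily_emission: float = 100.0) -> float:
    """Expected daily payout is your share of the nethash times
    the fixed daily emission (100 coins/day in the example)."""
    return daily_emission * my_hashrate / net_hashrate

# Two CPUs at 50 MH/s plus one ASIC at 900 MH/s (nethash = 1000 MH/s):
# daily_coins(50, 1000)  -> 5.0 coins per CPU miner
# daily_coins(900, 1000) -> 90.0 coins for the ASIC
# Three ASICs at 900 MH/s each (nethash = 2700 MH/s):
# daily_coins(900, 2700) -> 33.3 coins apiece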
This system, as it stands now, only perpetuates a never ending hashrate arms race in which the weapons of choice are usually a combination of efficiency, economics, profitability and in some cases control.

Implications of Centralization

This brings us to another big concern with expensive specialized hardware: the risk of centralization. Because they are so expensive and inaccessible to the casual miner, ASICs and FPGAs predominantly remain limited to a select few. Centralization occurs when one small group or a single entity controls the vast majority of hash power and, as a result, coin supply, and is able to exert its influence to manipulate the market or, in some cases, the network itself (usually the case with dishonest nodes or bad actors).
This is entirely antithetical to what cryptocurrency was born of, and since its inception many concerted efforts have been made to avoid centralization at all costs. An entity in control of a centralized coin would have the power to manipulate the price, and having a centralized hashrate would enable them to affect network usability, reliability, and even perform double spends leading to the demise of a coin, among other things.
The world of crypto is a strange new place, with rapidly growing advancements across many fields, economies, and borders, leaving plenty of room for improvement; while it may feel like a never-ending game of catch-up, there are many talented developers and programmers working around the clock to bring us all more sustainable solutions.

The Rise of FPGAs

With toolchains now letting FPGAs be programmed in the commonly used language C++, and due to their overall flexibility, FPGAs are becoming somewhat more common, especially in larger farms and in industrial settings; but they still remain primarily out of the hands of most mining enthusiasts and almost unheard of to the average hobby miner. Things appear to be changing, though (one example of which I’ll discuss below), and some believe we will soon see a day when mining with a CPU or GPU just won’t cut it any longer and the market will be dominated by FPGAs and specialized ASICs, bringing with them efficiency gains for proof of work while also carelessly leading us all towards the next round of spending.
A perfect real-world example of the effect specialized hardware has had on the crypto community was recently discovered involving a fairly new project called VerusCoin and a fairly new, relatively more economically accessible FPGA. The FPGA is designed to target specific alt-coins whose algos do not require RAM overhead. It was discovered that the company had released a new algorithm, kept secret from the public, which could effectively mine Verus at 20x the speed of GPUs, the next-fastest hardware type mining on the Verus network.
Unfortunately this was done with a deliberately secret approach, calling the Verus algorithm “Algo1” and encouraging owners of the FPGA to never speak of the algorithm in public channels, admonishing a user when they did let the cat out of the bag. The problem with this business model is that it is parasitic in nature. In an ecosystem where advancements can benefit the entire crypto community, this sort of secret mining approach also does not support the philosophies set forth by the Bitcoin or subsequent open source and decentralization movements.
Although this was not done in the spirit of open source, it does hint to an important step in hardware innovation where we could see more efficient specialized systems within reach of the casual miner. The FPGA requires unique sets of data called a bitstream in order to be able to recognize each individual coin’s algorithm and mine them. Because it’s reprogrammable, with the support of a strong development team creating such bitstreams, the miner doesn’t end up with a brick if an algorithm changes.

All is not lost thanks to.. um.. Technology?

Shortly after discovering FPGAs on the network, the Verus developers quickly designed, tested, and implemented a new, much more complex and improved algorithm via a fork that transitioned Verus smoothly from VerusHash 1.0 to VerusHash 2.0 at block 310,000. Since the fork, VerusHash 2.0 has done exactly what it was designed for: equalizing hardware performance relative to the device being used, enabling CPUs (the most widely available “ASICs”) to mine side by side with GPUs at a profit, and it appears this will also apply to other specialized hardware. This is something no other project has been able to do until now. Rather than pursue the folly of so many projects before it (attempting to be “ASIC proof”), Verus effectively achieved, and presents to the world, an entirely new model of “hardware homogeny”. As the late, great Bruce Lee once said: “Don’t get set into one form, adapt it and build your own, and let it grow, be like water.”
In the design of VerusHash 2.0, Verus has shown it doesn’t resist progress like so many other new algorithms try to do; it embraces change and adapts to it in the way that water becomes whatever vessel it inhabits. This new approach, an industry first, could very well become an industry standard and in doing so would usher in a new age for proof-of-work-based coins. VerusHash 2.0 has the potential to correct the single largest design flaw in the proof-of-work consensus mechanism: the ever expanding monetary and energy requirements that have plagued PoW-based projects since the mechanism's inception. Verus also addresses another major issue, coin and nethash centralization, by enabling legitimate CPU mining, offering greater coin and hashrate distribution.
Digging a bit deeper, it turns out the Verus development team are no rookies. The lead developer, Michael F Toutonghi, has spent decades in the field: he is a former Vice President and Technical Fellow at Microsoft, recognized founder and architect of Microsoft's .Net platform, ex-Technical Fellow of Microsoft's advertising platform, ex-CTO of Parallels Corporation, and an experienced distributed computing and machine learning architect. The project he helped create employs a diverse myriad of technologies and security features to form one of the most advanced and secure cryptocurrencies to date. A brief description of what makes VerusCoin special, quoted from a community member:
"Verus has a unique and new consensus algorithm called Proof of Power which is a 50% PoW/50% PoS algorithm that solves theoretical weaknesses in other PoS systems (Nothing at Stake problem for example) and is provably immune to 51% hash attacks. With this, Verus uses the new hash algorithm, VerusHash 2.0. VerusHash 2.0 is designed to better equalize mining across all hardware platforms, while favoring the latest CPUs over older types, which is also one defense against the centralizing potential of botnets. Unlike past efforts to equalize hardware hash-rates across different hardware types, VerusHash 2.0 explicitly enables CPUs to gain even more power relative to GPUs and FPGAs, enabling the most decentralizing hardware, CPUs (due to their virtually complete market penetration), to stay relevant as miners for the indefinite future. As for anonymity, Verus is not a "forced private", allowing for both transparent and shielded (private) transactions...and private messages as well"

If other projects can learn from this and adopt a similar approach, or continue to innovate with new ideas, it could mean an end to all the doom-and-gloom predictions that CPU and GPU mining are dead. It would offer a much-needed reprieve and an alternative to miners who have faced the difficult decision of either pulling the plug and shutting down shop, or breaking down their rigs to sell off parts and buy new, more expensive hardware, and in so doing would present an overall unprecedented level of decentralization not yet seen in cryptocurrency.
Technological advancements led us to the world of secure digital currencies and the progress being made with hardware efficiencies is indisputably beneficial to us all. ASICs and FPGAs aren’t inherently bad, and there are ways in which they could be made more affordable and available for mass distribution. More than anything, it is important that we work together as communities to find solutions that can benefit us all for the long term.

In an ever changing world where it may be easy to lose sight of the real accomplishments that brought us to this point, one thing is certain: cryptocurrency is here to stay, and the projects that are doing something to solve the current problems in the proof of work consensus mechanism will be the ones that lead us toward our collective vision of a better world - not just for the world of crypto but for each and every one of us.
submitted by Godballz to CryptoCurrency

What is SUQA project – How to mine SUQA coin

SUQA coin is fast (533 transactions per second) and has almost no transaction fees, which makes it unique compared with other projects.
SUQA is a new digital currency with a new X22i PoW algo. The X22i algo is not a copy or clone of any old one; it is designed to be completely ASIC-, FPGA- and quantum-resistant.
The SUQA project also has a new feature: you can earn 5% APR interest from term deposits, even if the wallet is offline.
The SUQA project aims to create a payment hub for SUQA payments without taking any fees.
SUQA has a unique time-lock interest system that lets every user earn deposit interest from their wallet without technical knowledge; a toy calculation of what 5% APR means is sketched below.
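As a back-of-the-envelope illustration only, assuming simple non-compounding interest (the actual payout rules are defined by the SUQA protocol, not by this sketch):

def term_deposit_interest(principal: float, days: int, apr: float = 0.05) -> float:
    """Simple-interest sketch of a 5% APR time-locked deposit.
    Assumes linear accrual; SUQA's real accrual rules may differ."""
    return principal * apr * days / 365.0

# e.g. locking 10,000 SUQA for 90 days:
# term_deposit_interest(10_000, 90) -> ~123.3 SUQA of interest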
Since the SUQA project is new, its unique ASIC-, FPGA- and quantum-resistant algorithm is very helpful for the mining community and solves numerous problems for miners. For example, if you have one GPU or only one rig, you will be able to mine to the last block of SUQA without competing against any ASIC or FPGA device, because SUQA will renew the algo every 6 months.
In real life, SUQA will support its community through foundation activities like mining, donations, bounty campaigns, etc. SUQA coin can be mined with GPUs (Nvidia and AMD) and CPUs, but it is only possible to mine it profitably with GPUs. SUQA's unique X22i algo is also energy efficient and makes your GPUs run cooler.
How To Mine SUQA coin
To mine SUQA coin:
1) Find miner software based on the GPU you have.
2) Download the wallet and get a SUQA wallet address.
3) Select a good pool.
Suqa Coin Algo: X22i
MINERS:
– AMD: zjazz_amd_miner – https://github.com/zjazz/zjazz_amd_miner/releases
– NVIDIA: ccminer-x22i win64 binary release – https://github.com/SUQAORG/ccminer-x22i/releases
– NVIDIA: zjazz_cuda_miner – https://github.com/zjazz/zjazz_cuda_miner/releases/tag/1.0
– NVIDIA: T-Rex – https://bitcointalk.org/index.php?topic=4432704.0
Example .bat code for the T-Rex miner:

t-rex.exe -a x22i -o stratum+tcp://suqa-pool.beepool.org:9504 -u YourSUQAWallet.WorkerName -p c=SUQA -R 3

Roughly: -a selects the algorithm, -o the pool's stratum URL, -u your wallet address plus worker name, -p the pool's password field, and -R the pause between reconnect attempts.


SUQA coin hashrate, SUQA coin mining performance

– Nvidia GTX 1070 – 6.4 MH/s (Gigabyte GTX 1070 OC)
– Nvidia GTX 1070 Ti – 7.5 MH/s (Gigabyte GTX 1070 Ti OC)
– Nvidia GTX 1080 Ti – 11.5 MH/s (MSI GTX 1080 Ti)
– AMD RX 580 – 3.2 MH/s (Sapphire RX 580)

SUQA coin Mining Profitability

Mining SUQA coin, you can expect roughly the following. SUQA price is now ~$0.005546 (138 satoshi):
– 120 SUQA / GTX 1080 Ti daily
– 75 SUQA / GTX 1070 Ti daily
– 65 SUQA / GTX 1070 daily
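If you want to sanity-check figures like these yourself, here is a small Python sketch. The formula is generic mining arithmetic, pool fees and luck variance are ignored, and any network hashrate you plug in is your own assumption:

def daily_coins(my_mhs: float, net_mhs: float,
                blocks_per_day: float, block_reward: float) -> float:
    """Expected coins/day = your share of nethash times daily emission."""
    return blocks_per_day * block_reward * my_mhs / net_mhs

def daily_revenue_usd(coins_per_day: float, price_usd: float) -> float:
    """Convert a daily coin yield into gross USD revenue (before power)."""
    return coins_per_day * price_usd

# Using the post's own numbers: a GTX 1080 Ti earning ~120 SUQA/day
# at $0.005546 per coin grosses about $0.67/day before electricity:
# daily_revenue_usd(120, 0.005546) -> ~0.666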
SUQA coin mining pools: https://bsod.pw, https://icemining.ca, https://beepool.org/coindetail/suqa – and for more pools: https://discord.gg/ArttaNA
SUQA coin Difficulty: 8824.28235 @ block 22 179
SUQA coin block reward: 5000
SUQA coin exchange markets: SUQA coin is now on the following official exchanges (check on CoinMarketCap):
  1. Escodex – https://wallet.escodex.com/market/ESCODEX.SUQACOIN_ESCODEX.BTC
  2. Stex – https://app.stex.com/en/basic-trade/paiBTC/SUQA/1D
  3. QBTC – https://www.myqbtc.com/trade
And Crypto-Bridge is on the way – https://wallet.crypto-bridge.org/market/BRIDGE.SUQA_BRIDGE.BTC
Website: https://suqa.org
Whitepaper: https://suqa.org/file/2018/10/suqa-whitepaper.pdf
BitcoinTalk: https://bitcointalk.org/index.php?topic=5038269.0
GitHub: https://github.com/SUQAORG
Twitter: https://twitter.com/SUQAfoundation
Facebook: https://facebook.com/SUQAFoundation
Discord: https://discord.gg/ArttaNA
Author: uk baxoi
profile: https://bitcointalk.org/index.php?action=profile;u=2364181
submitted by mahdi32 to gpumining


Swiss Alps Mining

Swiss Alps Mining & Energy Logo, Source: swissalpsmining.io
In the world of cryptocurrencies, money is not created, it is discovered, and this process is known as mining. Mining started off simple: it could be done with ordinary computer hardware. But with the passage of time the activity became profitable, inefficient equipment was left behind, and mining evolved into rig and ASIC farms.
Nowadays, mining farms are built seeking the best hardware conditions: an efficient hashrate, the lowest possible electrical consumption, and favorable environmental conditions that reduce energy consumption through 'natural cooling', combining eco-friendly solutions.
Swiss Alps Energy AG (SAE) is Swiss Alps Mining & Energy's operating business, combining unused buildings in the Swiss Alps, environmentally friendly mining, and a sophisticated mining farm design called the 'Modular Cube System' (SAM Cubes), which rethinks how a mining farm is developed.
SAE will rent out entire cubes or individual mining capacities, guaranteeing highly energy-efficient and cost-competitive mining. SAE will make it possible to rent mining facilities, and the power needed can be paid in SAM tokens, offered in SAE's Initial Coin Offering (ICO). Customers may also purchase cubes for their own use, obtaining the necessary power from SAE and operating the cube on SAE's premises, or deploy the cube elsewhere.
Key factors of the SAE mining farms.
In order to be profitable, it is essential to be efficient in all the factors that influence the cryptocurrency mining. SAE has taken into account a solution for all relevant factors:
Infrastructure: they take on disused structures that cannot be inhabited, which avoids the high costs of commercial property rentals.
A recovered alpine house for mining, source: swissalpsmining.io
SAM Cube: mining is evolving toward farms built inside adapted containers, which makes it easy to transport a farm to a new location. SAE developed its own mobile farm in prefabricated, fast-mountable SAM Cubes. The Cubes improve on the transport characteristics and design of containers used for maritime shipping; they are self-contained and operate autonomously, achieving maximally efficient mining for maximum profit. SAM Cubes are available in two models.
https://swissalpsmining.io/
submitted by daupreapnemem1986 to Crypto_ICO_Investing

submitted by Godballz to gpumining [link] [comments]

The Problem with PoW
(and what is being done to solve it)

"Frustrated Miners"

Proof of Work (PoW) is one of the most commonly used consensus mechanisms entrusted to secure and validate many of today’s most successful cryptocurrencies, Bitcoin being one. Battle-hardened and having weathered the test of time, Bitcoin has demonstrated the undeniable strength and reliability of the PoW consensus model through sheer market saturation and, of course, its persistence.
In addition to paying for powerful computing hardware, miners prove that they are benefiting the network by expending energy in the form of electricity, hashing away at computational puzzles with whatever suitable tools they have at their disposal. The mathematics securing proof of work revolve around unique algorithms, each with its own benefits and vulnerabilities, and mining can require different software and hardware depending on the coin.
Because each block presents a fresh, effectively random puzzle (finding a nonce that makes the block’s hash fall below a network-set target), the “work” has to be performed for each block individually, and the difficulty of the puzzle can be raised as the speed at which blocks are solved increases.
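To make the puzzle concrete, here is a minimal Python sketch of that search loop. The double-SHA256, the header bytes, and the toy 20-bit difficulty are illustrative assumptions; real coins use their own header formats, hash functions, and targets.

    import hashlib

    def mine(header: bytes, difficulty_bits: int) -> int:
        """Search for a nonce whose double-SHA256 falls below the target."""
        target = 2 ** (256 - difficulty_bits)  # more bits = smaller target = harder
        nonce = 0
        while True:
            data = header + nonce.to_bytes(8, "little")
            digest = hashlib.sha256(hashlib.sha256(data).digest()).digest()
            if int.from_bytes(digest, "big") < target:
                return nonce  # valid proof of work found
            nonce += 1

    # A 20-bit toy difficulty takes about 2**20 (~1 million) hashes on average.
    print(mine(b"example block header", 20))

Every extra bit of difficulty doubles the expected work, which is exactly the lever the network pulls when blocks start arriving too fast.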

Hashrates and Hardware Types

While proof of work is an effective means of securing a blockchain, it inherently promotes competition among miners seeking ever higher hashrates, due to the rewards earned by the miner who wins the right to add the next block. In turn, these higher hashrates benefit the blockchain, providing better security when they come from a well-distributed, decentralized network of miners.
When Bitcoin’s genesis block was mined, it was mined exclusively by CPUs. Over the years, programmers and developers have devised newer, faster, and more energy-efficient ways to generate higher hashrates: some by perfecting the software end of things, and others, when the incentives are great enough, by creating expensive specialized hardware such as ASICs (application-specific integrated circuits). Built with the express purpose of extracting every last bit of hashing power, efficiency being paramount, ASICs are stripped-down, bare-minimum hardware implementations of a specific coin’s algorithm.
This gives ASICs a massive advantage over CPUs and GPUs in raw hashing power and in energy consumption, but with significant drawbacks: they are very expensive to design and manufacture, which translates to a high economic barrier for the casual miner. And because they are hardware implementations of a single targeted algorithm, if a project decides to fork and change algorithms suddenly, your powerful brand-new ASIC becomes a very expensive paperweight. The high costs of developing and manufacturing ASICs, and the associated risks, make them unfit for mass adoption at this time.
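The advantage is easiest to see in energy per unit of work. The figures below are assumed orders of magnitude for illustration, not measurements of any particular device:

    # Illustrative (assumed) numbers: what matters is watts per MH/s,
    # i.e. the energy spent per unit of hashing work. Lower is better.
    devices = {
        "CPU":  {"mhs": 1,      "watts": 100},
        "GPU":  {"mhs": 30,     "watts": 200},
        "ASIC": {"mhs": 50_000, "watts": 1500},
    }
    for name, d in devices.items():
        print(f"{name}: {d['watts'] / d['mhs']:.4f} joules per MH")
    # CPU: 100.0000, GPU: 6.6667, ASIC: 0.0300 -> orders of magnitude apart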
Somewhere on the high end, in the vast hashrate expanse between GPU and ASIC, sits the FPGA (field-programmable gate array). FPGAs are essentially ASICs that trade some efficiency for flexibility: they are reprogrammable, and are often used in the “field” to test an algorithm before it is committed to an ASIC. As a precursor to the ASIC, FPGAs are somewhat similar to GPUs in their flexibility, but they require advanced programming skills and, like ASICs, are expensive and still fairly uncommon.

2 Guys 1 ASIC

One of the issues with proof of work incentivizing the pursuit of higher hashrates lies in how the network calculates block rewards and pays miners for the work they submit. If a coin generates, say, a block a minute, and this is a constant, what happens when more miners jump on the network and do more work? The network cannot pay out more than one block reward per minute, so a difficulty mechanism is used to maintain balance. The difficulty scales up and down in response to the overall nethash: if many miners join the network, or extremely high-hashing devices such as ASICs or FPGAs jump on, the network responds by making the puzzles harder, effectively giving the edge to hardware that can solve them faster and rebalancing the block interval. This not only maintains the block-a-minute reward, it has the side effect of energy requirements that scale up with network adoption.
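A minimal sketch of such a retargeting rule follows. The single-period adjustment and the 4x clamp are assumptions for illustration, not any specific coin’s consensus rule:

    def retarget(difficulty: float, target_spacing: float, actual_spacing: float) -> float:
        """Scale difficulty so blocks keep arriving at the target interval.

        Blocks arriving too fast (more nethash) raise difficulty; too slow
        lowers it. The clamp damps wild swings in any one period.
        """
        ratio = target_spacing / actual_spacing
        ratio = max(0.25, min(4.0, ratio))
        return difficulty * ratio

    # Blocks arriving every 30s instead of the intended 60s -> difficulty doubles.
    print(retarget(1000.0, 60.0, 30.0))  # 2000.0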
Imagine, for example, one miner alone on a network with a CPU doing 50 MH/s and getting all 100 coins that can possibly be paid out in a day. If another miner joins with the same CPU, each would receive 50 coins a day instead of 100, since they split the required work evenly, even though the network’s electrical consumption has doubled along with the work. Electricity costs miners money and, along with adoption, is a factor in driving up coin price; and since more people are now mining, the coin is less centralized. Now say a large corporation finds it profitable to manufacture an ASIC for this coin, knowing it will make its money back mining it or selling the units to professionals. It joins the network doing 900 MH/s and pulls in 90 coins a day, while the two guys with their CPUs each get 5. Those two guys aren’t very happy, but the corporation is. Not only does this hurt the miners, it compromises the security of the entire network by centralizing the coin supply and hashrate, opening the doors to double spends and 51% attacks from malicious actors. Uncertainty of motives and questionable validity do not mix with a distributed ledger.
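The arithmetic here is just proportional splitting of a fixed daily payout. This quick sketch reproduces the numbers above, plus the three-ASIC case discussed in the next paragraph:

    def daily_coins(my_hashrate: float, other_hashrates: list[float],
                    payout: float = 100.0) -> float:
        """Expected coins per day: fixed payout split by hashrate share."""
        return payout * my_hashrate / (my_hashrate + sum(other_hashrates))

    print(daily_coins(50, []))            # lone CPU: 100.0
    print(daily_coins(50, [50]))          # two equal CPUs: 50.0 each
    print(daily_coins(50, [50, 900]))     # an ASIC joins: each CPU drops to 5.0
    print(daily_coins(900, [50, 50]))     # ...while the ASIC takes 90.0
    print(daily_coins(900, [900, 900]))   # three equal ASICs: ~33.3 each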
When technology advances in a field it is usually applauded and welcomed with open arms, but in the world of crypto things work quite differently. One of the glaring flaws in the current model, and in the advent of specialized hardware, is that it never ends. Suppose the two men from the rather extreme example above take out loans for that ASIC they heard can earn 90 coins a day. When they join the other ASIC on the network, the difficulty adjusts to keep daily payouts at 100, and each receives only about 33 coins instead of 90, since the reward is now split three ways. And what happens when a better ASIC is released by that corporation? Hopefully those two guys paid off their loans and sold their old ASICs before they became obsolete.
This system, as it stands, only perpetuates a never-ending hashrate arms race in which the weapons of choice are usually some combination of efficiency, economics, profitability and, in some cases, control.

Implications of Centralization

This brings us to another big concern with expensive specialized hardware: the risk of centralization. Because they are so expensive and inaccessible to the casual miner, ASICs and FPGAs predominantly remain in the hands of a select few. Centralization occurs when one small group or a single entity controls the vast majority of the hash power and, as a result, the coin supply, and is able to exert its influence to manipulate the market or, in some cases, the network itself (usually the case with dishonest nodes or bad actors).
This is entirely antithetical to the ideals cryptocurrency was born from, and since its inception many concerted efforts have been made to avoid centralization at all costs. An entity in control of a centralized coin supply would have the power to manipulate the price, and one in control of a centralized hashrate could degrade network usability and reliability, and even perform double spends leading to the demise of the coin, among other things.
The world of crypto is a strange new place, with rapidly growing advancements across many fields, economies, and borders, leaving plenty of room for improvement. While it may feel like a never-ending game of catch-up, there are many talented developers and programmers working around the clock to bring us all more sustainable solutions.

The Rise of FPGAs

With recent toolchains that allow them to be programmed in the commonly used language C++, and thanks to their overall flexibility, FPGAs are becoming somewhat more common, especially in larger farms and industrial settings; but they still remain largely out of the hands of most mining enthusiasts and almost unheard of to the average hobby miner. Things appear to be changing, though, one example of which I’ll discuss below, and some believe we will soon see a day when mining with a CPU or GPU just won’t cut it any longer and the market will be dominated by FPGAs and specialized ASICs, bringing efficiency gains for proof of work while also carelessly leading us all toward the next round of spending.
A perfect real-world example of the effect specialized hardware has had on the crypto community recently came to light, involving a fairly new project called Verus Coin (https://veruscoin.io/) and a fairly new, relatively more economically accessible FPGA designed to target specific altcoins whose algorithms do not require RAM overhead. It was discovered that the company behind it had released a new algorithm, kept secret from the public, which could effectively mine Verus at 20x the speed of GPUs, the next-fastest hardware type mining on the Verus network.
Unfortunately this was done with a deliberately secretive approach: the Verus algorithm was referred to only as “Algo1”, and owners of the FPGA were encouraged never to speak of it in public channels, with one user admonished when they did let the cat out of the bag. The problem with this business model is that it is parasitic in nature. In an ecosystem where advancements can benefit the entire crypto community, this sort of secret mining approach does not support the philosophies set forth by Bitcoin or the subsequent open-source and decentralization movements.
Although this was not done in the spirit of open source, it does hint at an important step in hardware innovation, one where we could see more efficient specialized systems within reach of the casual miner. An FPGA requires a unique set of configuration data, called a bitstream, for each individual coin’s algorithm it is to recognize and mine. Because the device is reprogrammable, with the support of a strong development team creating such bitstreams, the miner doesn’t end up with a brick if an algorithm changes.
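As a purely hypothetical illustration of that difference (the algorithm names, file names, and flashing step below are invented, not any real vendor’s tooling), reprogrammability amounts to swapping a per-algorithm configuration rather than replacing the silicon:

    # Hypothetical sketch: an FPGA miner selects a bitstream per algorithm,
    # whereas an ASIC is permanently wired for exactly one of them.
    BITSTREAMS = {
        "verushash": "verushash.bit",   # invented file names
        "equihash":  "equihash.bit",
    }

    def reprogram(algorithm: str) -> str:
        bitstream = BITSTREAMS.get(algorithm)
        if bitstream is None:
            # No bitstream yet: the device idles until one is written,
            # it is not bricked the way an ASIC would be after a fork.
            raise ValueError(f"no bitstream available for {algorithm}")
        return f"flashing {bitstream} onto the FPGA fabric"

    print(reprogram("verushash"))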

All is not lost thanks to… um… Technology?

Shortly after discovering FPGAs on the network, the Verus developers designed, tested, and implemented a new, much more complex and improved algorithm via a fork that transitioned the chain smoothly from VerusHash 1.0 to VerusHash 2.0 at block 310,000. Since the fork, VerusHash 2.0 has done exactly what it was designed to do: equalize hardware performance relative to the device being used, enabling CPUs (the most widely available “ASICs”) to mine side by side with GPUs at a profit, and it appears this will also apply to other specialized hardware. This is something no other project has been able to do until now. Rather than pursue the folly of so many projects before it, attempting to be “ASIC-proof”, Verus achieved and presents to the world an entirely new model of “hardware homogeny”. As the late, great Bruce Lee once said: “Don’t get set into one form, adapt it and build your own, and let it grow, be like water.”
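Mechanically, an algorithm fork like this amounts to consensus selecting the hash function by block height. A hedged sketch, with hashlib stand-ins in place of the real VerusHash implementations:

    import hashlib

    FORK_HEIGHT = 310_000  # VerusHash 2.0 activation height, per the article

    # Stand-ins only; the real hash functions are far more involved.
    def verushash_v1(header: bytes) -> bytes:
        return hashlib.sha256(b"v1" + header).digest()

    def verushash_v2(header: bytes) -> bytes:
        return hashlib.sha256(b"v2" + header).digest()

    def block_hash(height: int, header: bytes) -> bytes:
        """Every node applies the same height rule, so the whole chain
        switches algorithms atomically at the fork block."""
        fn = verushash_v2 if height >= FORK_HEIGHT else verushash_v1
        return fn(header)

    print(block_hash(309_999, b"hdr").hex())  # still VerusHash 1.0
    print(block_hash(310_000, b"hdr").hex())  # VerusHash 2.0 from here on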
In the design of VerusHash 2.0, Verus has shown that it doesn’t resist progress the way so many other new algorithms try to do; it embraces change and adapts to it the way water becomes whatever vessel it inhabits. This new approach, an industry first, could very well become an industry standard, and in doing so would usher in a new age for proof-of-work coins. VerusHash 2.0 has the potential to correct the single largest design flaw in the proof-of-work consensus mechanism: the ever-expanding monetary and energy requirements that have plagued PoW projects since the mechanism’s inception. Verus also addresses the major issue of coin and nethash centralization by enabling legitimate CPU mining, offering broader coin and hashrate distribution.
Digging a bit deeper, it turns out the Verus development team are no rookies. The lead developer, Michael F Toutonghi, has spent decades in the field: he is a former Vice President and Technical Fellow at Microsoft, recognized founder and architect of Microsoft's .Net platform, ex-Technical Fellow of Microsoft's advertising platform, ex-CTO of Parallels Corporation, and an experienced distributed computing and machine learning architect. The project he helped create employs a diverse set of technologies and security features to form one of the most advanced and secure cryptocurrencies to date. A brief description of what makes VerusCoin special, quoted from a community member:
"Verus has a unique and new consensus algorithm called Proof of Power which is a 50% PoW/50% PoS algorithm that solves theoretical weaknesses in other PoS systems (Nothing at Stake problem for example) and is provably immune to 51% hash attacks. With this, Verus uses the new hash algorithm, VerusHash 2.0. VerusHash 2.0 is designed to better equalize mining across all hardware platforms, while favoring the latest CPUs over older types, which is also one defense against the centralizing potential of botnets. Unlike past efforts to equalize hardware hash-rates across different hardware types, VerusHash 2.0 explicitly enables CPUs to gain even more power relative to GPUs and FPGAs, enabling the most decentralizing hardware, CPUs (due to their virtually complete market penetration), to stay relevant as miners for the indefinite future. As for anonymity, Verus is not a "forced private", allowing for both transparent and shielded (private) transactions...and private messages as well"

If other projects can learn from this and adopt a similar approach, or continue to innovate with new ideas, it could mean an end to all the doom-and-gloom predictions that CPU and GPU mining are dead. It would offer a much-needed reprieve and an alternative for miners who have faced the difficult decision of either pulling the plug and shutting down shop, or breaking down their rigs to sell off parts and buy new, more expensive hardware; and in so doing it would bring an overall level of decentralization not yet seen in cryptocurrency.
Technological advancements led us to the world of secure digital currencies, and the progress being made in hardware efficiency is indisputably beneficial to us all. ASICs and FPGAs aren’t inherently bad, and there are ways they could be made more affordable and available for mass distribution. More than anything, it is important that we work together as communities to find solutions that benefit us all for the long term.

In an ever-changing world where it may be easy to lose sight of the real accomplishments that brought us to this point, one thing is certain: cryptocurrency is here to stay, and the projects doing something to solve the current problems in the proof-of-work consensus mechanism will be the ones that lead us toward our collective vision of a better world, not just for crypto but for each and every one of us.
submitted by Godballz to CryptoTechnology [link] [comments]

If you are looking for an excellent mining GPU that consumes less energy and generates more hash power, the GTX 1070 is perfect. The GTX 1070 comes with 8GB of GDDR5 RAM, produces a high hash rate of around 30 MH/s without consuming much energy, and carries a 3-year warranty, so you can use it without worries. Its clock speed is ...

Although unlike Bitcoin, their total energy consumption is not transparent and cannot be as easily measured. The total Bitcoin network hash rate is publicly available and can be used to estimate the network's total electricity costs. Bitcoin mining has been designed to become more optimized over time, with specialized hardware consuming less energy, and the operating costs of mining should ...

MiningCave is a worldwide distributor offering after-sales service, technical support and a repair center for cryptocurrency mining hardware. We sell the best products on the market, ASIC and GPU mining hardware: Bitcoin miner, Litecoin miner, Ethereum miner and every new model on the market. We are based in Canada.

Here’s a list of top GPUs for mining the top two cryptocurrencies, Ethereum and Bitcoin. Best GPU for mining Ethereum: 1. AMD RX 470/570. The AMD RX 470/570 has a power draw of 80W to 200W, a hashrate of 20 to 30 MH/s, and 4GB or 8GB of GDDR5 VRAM, making it the most energy-efficient series out in the market. With the use of proper ...

The Top 14 Hardware Setups for Crypto Mining: if you want to mine cryptocurrency, you need the right hardware, and if you’ve got the right hardware, you can start earning cryptocurrency immediately. Today, we’re highlighting the top hardware setups for crypto mining.


4 MH/S Scrypt Miner - GPU Litecoin mining rig

5 R9 290 ATI graphics cards mining Litecoin at ~4 MH/s stable. Mine Dogecoin, Litecoin, or any scrypt coin with this GPU mining rig.
Bitcoin Mining vs. GPU Mining & How to Assemble a Mining Rig for Ethereum, Dash, LBRY & More, Part 4 - Duration: 7:24. TheBitcoinMiner, 33,125 views.
