Why isn't the size of the blockchain a serious problem for ...

Bitcoin: Is the Block Chain Getting too Big?

submitted by _CapR_ to peercoin
submitted by crypto_coiner to Bitcoin
submitted by JPSchaumleffel to CryptoCurrency

[OWL WATCH] Waiting for "IOTA TIME" 20; Hans's re-defined directions for DLT

Disclaimer: This is my own editing, so there could be some misunderstandings...
--------------------------------------------
wellwho — today 4:50 PM
u/Ben Royce how far is society2 from having something clickable powered by IOTA?
Ben Royce — today 4:51 PM
demo of basic tech late Sep / early Oct. MVP early 2021
---------------------------------------------------
HusQy
Colored coins are the most misunderstood upcoming feature of the IOTA protocol. A lot of people see them just as a competitor to ERC-20 tokens on ETH and therefore a way of tokenizing things on IOTA, but they are much more important because they enable "consensus on data".
Bob
All this stuff already works on neblio but decentralized and scaling to 3500 tps
HusQy
Neblio has 8 MB blocks with a 30-second blocktime. This is a throughput of 8 MB / 30 seconds = 267 KB per second. Transactions are 401+ bytes, which means that throughput is 267 KB / 401 bytes = 665 TPS. IOTA is faster, feeless and will get even faster with the next update ...
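For the curious, here is that throughput arithmetic as a quick script - a back-of-the-envelope sketch using the figures quoted above (8 MB blocks, 30-second block time, 401-byte transactions), nothing more:

```python
# Back-of-the-envelope TPS estimate from block size and block time,
# using the figures quoted above (8 MB blocks, 30 s block time,
# 401-byte minimum transaction size). Illustrative only.

BLOCK_SIZE_BYTES = 8_000_000   # 8 MB block
BLOCK_TIME_SECONDS = 30        # one block every 30 seconds
TX_SIZE_BYTES = 401            # minimum transaction size

bytes_per_second = BLOCK_SIZE_BYTES / BLOCK_TIME_SECONDS
max_tps = bytes_per_second / TX_SIZE_BYTES

print(f"throughput: {bytes_per_second / 1000:.0f} KB/s")  # ~267 KB/s
print(f"max TPS:    {max_tps:.0f}")                       # ~665 TPS
```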
-----------------------------------------------------------------------------
HusQy
Which DLT would be more secure? One that is collaboratively validated by the economic actors of the world (corporations, companies, foundations, states, people) or one that is validated by an anonymous group of wealthy crypto holders?
HusQy
The problem with current DLTs is that we use protection mechanisms like Proof of Work and Proof of Stake that are inherently hard to shard. The more shards you have, the more you have to distribute your hashing power and your stake and the less secure the system becomes.
HusQy
Real world identities (i.e. all the big economic actors) however could shard into as many shards as necessary without making the system less secure. Today's DLTs waste trust in the same way as PoW wastes energy.
HusQy
Is a secure money worth anything if you can't trust the economic actors that you would buy stuff from? If you buy a car from Volkswagen and they just beat you up and throw you out of the shop after you paid, then a secure money won't be useful either :P
HusQy
**I believe that if you want to make DLT work and be successful then we need to ultimately incorporate things like trust in entities into the technology.** Examples like Wirecard show that trusting a single company is problematic, but trusting the economy as a whole should be at ...
**... least as secure as today's DLTs.** And as soon as you add sharding it will be orders of magnitude more secure. DLT has failed to deliver because people have tried to build a system in a vacuum that completely ignores things that already exist and that you can leverage.
----------------------------------------------------------------------------------
HusQy
Blockchain is a bit like people sitting in a room, trying to communicate through BINGO sheets. While they talk, they write down some of the things that have been said and as soon as one screams BINGO! he hands around his sheet to inform everybody about what has been said.
HusQy
If you think that this is the most efficient form of communication for people sitting in the same room and the answer to scalability is to make bigger BINGO sheets or to allow people to solve the puzzle faster then you will most probably never understand what IOTA is working on.
--------------------------------------------------------------------------------
HusQy
**Blockchain does not work with too many equally weighted validators.** **If 400 validators produce a validating statement (block) at the same time then only one can survive as part of a longest chain.** IOTA is all about collaborative validation.
**Another problem of blockchain is that every transaction gets sent twice through the network. Once from the nodes to the miners and a 2nd time from the miners as part of a block.** Blockchain will therefore always only be able to use 50% of the network throughput.
And the last problem is that you cannot arbitrarily decrease the time between blocks, as it breaks down if the time between blocks gets smaller than the average network delay. The idle time between blocks is precious time that could be used for processing transactions.
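To make the last two points concrete, here is a toy model of the bandwidth argument; all numbers are made-up placeholders, not measurements of any real network:

```python
# Toy model of the two overheads described above. Every transaction
# crosses the network twice (node -> miner, miner -> network inside a
# block), and the network sits idle while a block propagates. All
# numbers are illustrative placeholders.

raw_capacity_tps = 1000      # transactions/s the links could carry
block_interval_s = 2.0       # seconds between blocks
avg_network_delay_s = 0.5    # seconds for a block to propagate

# Double transmission: at most half the raw capacity is usable.
after_double_send = raw_capacity_tps / 2

# Propagation: the fraction of each interval lost to block delivery.
idle_fraction = avg_network_delay_s / block_interval_s
usable_tps = after_double_send * (1 - idle_fraction)

print(f"after double-send:      {after_double_send:.0f} TPS")  # 500
print(f"after propagation loss: {usable_tps:.0f} TPS")         # 375
# As block_interval_s approaches avg_network_delay_s, usable_tps -> 0:
# you cannot shrink the block time below the network delay.
```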
-----------------------------------------------------------------------------
HusQy
I am not talking about a system with a fixed number of validators but one that is completely open and permissionless where any new company can just spin up a node and take part in the network.
------------------------------------------------------------------------
HusQy
Proof of Work and Proof of Stake are both centralizing sybil-protection mechanisms. I don't think that Satoshi wanted 14 mining pools to run the network.
And "economic clustering" was always the "end game" of IOTA.
-----------------------------------------------------------------------------
HusQy
**Using Proof of Stake is not trustless. Proof of Stake means you trust the richest people and hope that they approve your transactions. The rich are getting richer (through your fees) and you are getting more and more dependent on them.** Is that your vision of the future?
----------------------------------------------------------------------------

HusQy
Please read again exactly what I wrote. I have not spoken of introducing governance by large companies, nor have I said that IOTA should be permissioned. We aim for a network with millions or even billions of nodes.

HusQy
That can't work at all with a permissioned ledger - who would then approve all these devices or authorize them to participate in the network? My key message was the following: Proof of Work and Proof of Stake will always be, if you split them up via sharding ...

HusQy
... less secure because you simply need fewer coins or less hash power to have the majority of the votes in a shard. This is not the case with trust in society and the economy. When all companies in the world jointly secure a DLT ...

HusQy
... then these companies could install any number of servers in any number of shards without compromising security, because "trust" does not diminish just because they operate several servers. First of all, that is a fact and nothing else.

HusQy
Proof of Work and Proof of Stake are, contrary to the assumption of many, not "trustless" but follow the maxim: "In the greed of miners we trust!" The basic assumption that the miners do not destroy the system that generates income for them is fundamental here for the ...

HusQy
... security of every DLT. I think a similar assumption would still be correct for the economy as a whole: the companies of the world (and not just the big ones) would not destroy the system with which their customers pay them. In this respect, a system ...

HusQy
... which is validated by society and the economy as a whole would probably be just as "safe" as a system which is validated by a few anonymous miners. Why a small elite of miners should be better validators than the people and ...

HusQy
... companies of this world is, to be honest, beyond me. As already written in my other thread, secure money does not bring you anything if you have to assume that Volkswagen will beat you up and throw you out of the store after you ...

HusQy
... paid for a car. The thoughts I discussed say nothing about the immediate future of IOTA (for Coordicide we use mana) but rather speak of a world where DLT has already become an integral part of our lives and we ...

HusQy
... have a corresponding number of companies, non-profit organizations and people using DLT, where such a system could be implemented. The point here is not to create a governance solution that in any way influences the development of the technology ...

HusQy
... or that has to give nodes its OK first, but about developing a system that enables people to freely choose the validators they trust. For example, you can also declare your grandma to be a validator when you install your node, or your ...

HusQy
... local supermarket. Economic relationships in the real world usually form a close-knit network and it doesn't really matter who you follow as long as the majority is honest. I also don't understand your criticism regarding censorship, because something like that in IOTA ...

HusQy
... is almost impossible. Each transaction confirms two other transactions, so the set of approving transactions grows exponentially. If someone wanted to ignore a transaction, he would have to ignore an exponentially growing number of other transactions after a very short time. In contrast to blockchain ...

HusQy
... validators in IOTA do not decide what is included in the ledger, but only decide which of several double spends should be confirmed. Honest transactions are confirmed simply by having other transactions reference them ...

HusQy
... and the "validators" are not even asked. As for the "dust problem", this is indeed something that is a bigger problem for IOTA than for other DLTs because we have no fees, but it is also not an unsolvable problem. Bitcoin initially has a ...

HusQy
... solved a similar problem by declaring outputs with a minimum amount of 5430 satoshis as invalid ( github.com/Bitcoin/Bitcoi…). A similar solution where an address must contain a minimum amount is also conceivable for IOTA and we are discussing ...

HusQy
... several possibilities (including compressing dust using cryptographic methods). Contrary to your assumption, checking such a minimum amount is not slow but just as fast as checking a normal transaction. And in my ...

HusQy
... opinion this is no problem at all for IOTA's use case. The important thing is that you can send small amounts, but since IOTA is feeless it is also okay to expect the recipients to regularly send their payments to a ...

HusQy
... merge address. The wallets already do this automatically (sweeping) and for machines it is no problem to automate this process. So far this was not a problem because the TPS were limited but with the increased TPS throughput of ...

HusQy
... Chrysalis it becomes relevant and appropriate solutions are discussed and then implemented accordingly. I think that was the most important thing first and if you have further questions just write :)
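As an aside, the minimum-balance idea above is easy to express in code. This is a hypothetical sketch loosely modeled on Bitcoin's old 5430-satoshi dust limit; the constant and the function are illustrative, not IOTA's actual rules:

```python
# Hypothetical dust rule: an address may either be emptied completely
# (swept to a merge address) or must keep at least a minimum balance.
# Modeled loosely on Bitcoin's historical 5430-satoshi dust limit;
# not actual IOTA code.

MIN_ADDRESS_BALANCE = 5430  # placeholder minimum, in base units

def is_valid_resulting_balance(balance: int) -> bool:
    """Valid if the address is swept empty or keeps the minimum."""
    return balance == 0 or balance >= MIN_ADDRESS_BALANCE

# Checking this is a single comparison, i.e. no slower than any
# other validity check on a transaction.
assert is_valid_resulting_balance(10_000)   # normal balance: fine
assert is_valid_resulting_balance(0)        # sweeping stays allowed
assert not is_valid_resulting_balance(100)  # dust: rejected
```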

HusQy
And to be very clear! I really appreciate you and your questions and don't see this as an attack at all! People who see such questions as inappropriate criticism should really ask whether they are still objective. I have little time at the moment because ...

HusQy
... my girlfriend is on tour and I have to take care of our daughter, but as soon as she is back we can discuss these things in a video. I think that the concept of including the "real world" in the concepts of DLT is really exciting and ...

HusQy
... that would certainly be exciting to discuss in a joint video. But again, that's more of a vision than a specific plan for the immediate future. This would not work with blockchain anyway but IOTA would be compatible so why not think about such things.
-----------------------------------------------------------------------

HusQy
All good, big guy :P But actually not that much has changed. There has always been the concept of "economic clustering" which is basically based on similar ideas. We are just now able to implement things like this for the first time.
----------------------------------------------------------------------------------

HusQy
Exactly. It would mean that addresses "cost" something but I would rather pay a few cents than fees for each transaction. And you can "take" this minimum amount with you every time you change to a new address.

-----------------------------------------------------------------------------------

Relax — today 1:17 AM
Btw. Hans (sorry for interrupting this convo), but what makes people say that IOTA is going the permissioned way because of your latest tweets? I don't get why some people are now forecasting that... Is it because of missing specs or do they just not get the whole idea?

Hans Moog [IF] — today 1:20 AM
it's bullshit u/Relax - an identity based system would still be open and permissionless where everybody can choose the actors that they deem trustworthy themselves, but that's anyway just something that would be applicable with more adoption
[1:20 AM]
for now we use mana as a predecessor to an actual reputation system

Sissors — today 1:31 AM
If everybody has to choose actors they deem trustworthy, is it still permissionless? Probably will become a bit of a semantic discussion, but still

Hans Moog [IF] — today 1:34 AM
Of course it's permissionless - you can follow your grandma if you want to :p

Sissors — today 1:36 AM
Well sure you can, but you will need to follow something which has a majority of the voting power in the network. Nice that you follow your grandma, but if others don't, her opinion (or well, her node's opinion) is completely irrelevant

Hans Moog [IF] — today 1:37 AM
You would ideally follow the people that are trustworthy rather than your local drug dealers yeah

Sissors — today 1:38 AM
And tbh, sure, if you do it like that it is easy - if you just make the users responsible for only connecting to trustworthy nodes

Hans Moog [IF] — today 1:38 AM
And if your grandma follows her supermarket and some other people she deems trustworthy then that's fine as well
[1:38 AM]
+ you don't have just 1 actor that you follow

Sissors — today 1:38 AM
No, you got a large list, since you want to follow those which actually matter. So you just download a standard list from the internet

Hans Moog [IF] — today 1:39 AM
You can do that
[1:39 AM]
Is bitcoin permissionless? Should we both try to become miners?
[1:41 AM]
I mean miners that actually matter and not find a block every 10 trillion years
[1:42 AM]
If you would want to become a validator then you would need to build up trust among other people - but anybody can still run a node and issue transactions, unlike in hashgraph where you are not able to run your own nodes (edited)
[1:48 AM]
Proof of Stake is also not trustless - it just has a built-in mechanism that downloads the trusted people from the blockchain itself (the richest dudes)

Sissors — today 1:52 AM
I think most agree it would be perfect if every person had one vote. Which is problematic to implement of course. But I really wonder if the solution is to just let users decide who to trust. At the very least I expect a quite centralized network

Hans Moog [IF] — today 1:53 AM
of course even a trust based system would to a certain degree be centralized, as not every person is equally trustworthy as, for example, a big corporation
[1:53 AM]
but I think it's gonna be less centralized than PoS or PoW
[1:53 AM]
but anyway it's something for "after coordicide"
[1:54 AM]
there are not enough trusted entities using DLT yet to make such a system work reasonably well
[1:54 AM]
I think the reason why blockchain has not really started to look into these kinds of concepts is because blockchain doesn't work with too many equally weighted validators
[1:56 AM]
I believe that DLT is only going to take over the world if it is actually "better" than existing systems, and by better I mean cheaper, more secure and faster - and PoS and PoW will have a very hard time delivering that
[1:56 AM]
especially if you consider that it's not only going to settle value transfers

Relax — today 1:57 AM
I like these clear statements, it makes it really clear that DLT is still in its infancy

Hans Moog [IF] — today 1:57 AM
currently bank transfers are orders of magnitude cheaper than BTC or ETH transactions

Hans Moog [IF] — today 1:57 AM
and if we think that people will adopt it just because it's crypto then I think we are mistaken
[1:57 AM]
The tech needs to actually solve a problem
[1:57 AM]
and tbh currently people use PayPal and other companies to settle their payments
[1:58 AM]
having a group of the top 500 companies run such a service together is already much better (edited)
[1:58 AM]
especially if it's fast and feeless
[2:02 AM]
and the more people use it, the more decentralized it actually becomes
[2:02 AM]
because you have more trustworthy entities to choose from

Evaldas [IF] — today 2:08 AM
"in the greed of miners we trust"


submitted by btlkhs to Iota

[OWL WATCH] Waiting for "IOTA TIME" 30;

Disclaimer: This is sort of my own arbitrary editing, so there could be some misunderstandings.
I root for the spread of good spirits and transparency of IF.
Hans Moog [IF] — yesterday 2:45 PM
So why don't we just copy Avalanche? Well that's pretty simple ...
Hans Moog [IF] — yesterday 2:47 PM
1. It doesn't scale very well with the amount of nodes in the network that have no say in the consensus process but are merely consensus-consuming nodes (i.e. sensors, edge devices and so on). If you assume that the network will never have more than a few thousand nodes then that's fine, but if you want to build a DLT that can cope with millions of devices then it won't work because of the message complexity.
2. If somebody starts spamming conflicts, then the whole network will stop confirming any transactions and will grind to a halt until the conflict spamming stops. Avalanche thinks that this is not a huge problem because an attacker would have to spend fees for spamming conflicts, which means that he couldn't do this forever and would at some point run out of funds.
IOTA tries to build a feeless protocol, and a consensus that stops functioning if somebody spams conflicts is really not an option for us.
3. If a medium-sized validator goes offline for whatever reason, then the whole network will again stop confirming any transactions, because whenever a query for a node's opinion cannot be answered they reset the counter for consecutive successful voting rounds, which will prevent confirmations. Since nodes need to open some ports to be available for queries it is super easy to DDoS validators and again bring the network confirmations to 0.
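To illustrate point 3, here is a toy simulation of a confirmation rule that requires a run of consecutive successful query rounds and resets on any unanswered query; the parameters are invented and this is not Avalanche's actual code:

```python
import random
from typing import Optional

# Toy model of point 3: finality needs BETA consecutive successful
# query rounds, and a single unanswered query resets the streak.
# A DDoSed validator that fails to answer therefore stalls
# confirmations. Invented parameters; not Avalanche's real code.

BETA = 20           # required consecutive successful rounds
MAX_ROUNDS = 10_000

def rounds_until_final(p_query_fails: float) -> Optional[int]:
    streak = 0
    for round_no in range(1, MAX_ROUNDS + 1):
        if random.random() < p_query_fails:
            streak = 0               # unanswered query: reset the counter
        else:
            streak += 1
            if streak >= BETA:
                return round_no      # transaction finalizes here
    return None                      # never finalized in the simulation

random.seed(1)
print(rounds_until_final(0.0))   # healthy network: finalizes in 20 rounds
print(rounds_until_final(0.2))   # flaky validator: noticeably slower
print(rounds_until_final(0.5))   # DDoSed validator: likely never (None)
```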
Hans Moog [IF] — yesterday 3:05 PM
4. Avalanche still processes transactions in "chunks/blocks" by only applying them after they have gone through some consensus process (gathered enough successful voting rounds), which means that the nodes will waste a significant amount of time where they "wait" for the next chunk to be finished before the transactions are applied to the ledger state. IOTA tries to streamline this process by decoupling consensus and the booking of transactions by using the "parallel reality based ledger state", which means that nodes in IOTA will never waste any time "waiting" for decisions to be made. This will give us much higher throughput numbers.
Hans Moog [IF] — yesterday 3:11 PM
5. Avalanche has some really severe game-theoretic problems where nodes are incentivized to attach their transactions to the already decided parts of the DAG, because then things like conflict spam won't affect these transactions as badly as the transactions issued by honest nodes. If however every node followed this "better and selfish" tip selection mechanism then the network would stop working at all.
Overall the "being able to stop consensus" might not be too bad since you can't really do anything really bad (i.e. double spend), which is why we might not see these kinds of attacks in the immediate future - but just wait until a few DeFi apps are running on their platform where smart contracts are actually relying on more or less real-time execution of the contracts. Then there might be some actual financial gains to be made if the contract halts, and we might see a lot of these things appear (including selfish tip selection).
Avalanche is barely a top 100 project and nobody attacks these kinds of low-value networks unless there is something to be gained from such an attack. Saying that the fact that it's live on mainnet and hasn't been attacked in 3 weeks is proof of its security is completely wrong.
Especially considering that 95% of all stake is controlled by Avalanche itself
If you control > 50% of the voting power then you essentially control the whole network and attacks can mostly be ignored
I guess there is a reason for Avalanche only selling 10% of the token supply to the public, because then some of the named problems are less likely to appear
Navin Ramachandran [IF] — yesterday 3:21 PM
I have to say that wtf's suggestion is pretty condescending to all our researchers. It seems heavy on the troll aspect to suggest that we should ditch all our work because iota is only good at industrial adoption. Does wtf actually expect a response to this? Or is this grandstanding?
Hans Moog [IF] — yesterday 3:22 PM
The whole argument of "why don't you just use X instead of trying to build a better version" is also a completely idiotic argument. Why did ETH write their own protocol if Bitcoin was already around? Well, because they saw problems in Bitcoin's approach and tried to improve it.
Hans Moog [IF] — yesterday 3:27 PM
u/Navin Ramachandran [IF] It's like most of his arguments ... remember when he said we should implement colored coins in 2nd layer smart contracts instead of the base layer because they would be more expressive (i.e. Turing complete), completely discarding that 2nd layer smart contracts only really work if you have a consensus on data, and therefore state, for which you need the "traceability" of funds to create these kinds of mini blockchains in the tangle?
Colored coins "enable" smart contracts and it wouldn't work the other way round - unless you have a platform that works exactly like ETH where all the nodes validate a single shared execution platform of the smart contracts, which is not really scalable and is exactly what we are trying to solve with our approach.
Navin Ramachandran [IF] — yesterday 3:28 PM
Always easier to criticise than to build something yourself. And yet he keeps posting these inflammatory posts.
At this point, is there any doubt whether he is making these comments constructively?
Hans Moog [IF] — yesterday 3:43 PM
If he would at least try to understand IOTA's vision ... then maybe he wouldn't have to ask things like "Why don't you just copy a tech that only works with fees"
Hans Moog [IF] — yesterday 4:35 PM
u/Shaar
I thought this would only be used to 'override' finality, eg if there were network splits. But not in normal consensus
That is not correct. Every single transaction gets booked on arrival using the parallel reality based ledger state. If there are conflicts then we create a "branch" (container in the ledger state) that represents the perception that this particular double spend would be accepted by consensus. After consensus is reached, the container is simply marked as "accepted" and all transactions that are associated with this branch are immediately confirmed as well. This allows us to make the node use all of its computing resources 24/7 without having to wait for any kind of decision to be made, and allows us to scale the throughput to its physical limits. That's the whole idea of the "parallel reality based ledger state": instead of designing a data structure that models the ledger state "after consensus" like everybody else is doing, it is tailored to model the ledger state "before consensus", and then you just flip a flag to persist your decision. The "resync mechanism" also uses the branches to measure the amount of approval a certain perception of the ledger state receives. So if my own opinion is not in line with what the rest of the network has accepted (i.e. because I was eclipsed or because there was a network split), then I can use the weight of these branches to detect this "being out of sync" and can do another larger query to re-evaluate my decision. (edited)
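A minimal sketch of that "book on arrival, flip a flag later" idea, assuming invented names and a heavily simplified structure (this is not GoShimmer's actual ledger code):

```python
from dataclasses import dataclass, field
from typing import List

# Heavily simplified sketch of the "parallel reality based ledger
# state": conflicting spends are booked immediately into their own
# branches, and consensus later just marks one branch as accepted.
# Invented names; not GoShimmer's actual data structures.

@dataclass
class Branch:
    name: str
    transactions: List[str] = field(default_factory=list)
    accepted: bool = False        # flipped once consensus is reached

ledger = {"master": Branch("master", accepted=True)}

def book_on_arrival(tx: str, conflict_branch: str = "master") -> None:
    """Book every transaction immediately; a double spend simply
    lands in its own branch instead of waiting for consensus."""
    branch = ledger.setdefault(conflict_branch, Branch(conflict_branch))
    branch.transactions.append(tx)

def persist_decision(winner: str) -> None:
    """Consensus output: flip the flag; everything booked into the
    winning branch confirms at once."""
    ledger[winner].accepted = True

book_on_arrival("tx_a")                           # no conflict: master
book_on_arrival("spend_1", conflict_branch="A")   # double spend, reality A
book_on_arrival("spend_2", conflict_branch="B")   # double spend, reality B
persist_decision("A")                             # consensus picks A
print([b.name for b in ledger.values() if b.accepted])  # ['master', 'A']
```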
Also what happens in IOTA if DRNG nodes fall out - does the network continue if no new RNGs appear for a while? Or will new nodes be added sufficiently fast to the DRNG committee that no one notices?
It's a committee and not just a single DRNG provider. If a few nodes fail then it will still produce random numbers. And even if the whole committee fails there are fallback RNGs that would be used instead
Hans Moog [IF] — yesterday 4:58 PM
And multiverse doesn't use FPC but only the weight of these branches, in the same way as blockchain uses the "longest chain wins" consensus to choose between conflicts. So nodes simply attach their transactions to the transactions that they have seen first, and if there are conflicts then you simply monitor which version received more approval and adjust your opinion accordingly.
Hans Moog [IF] — yesterday 5:07 PM
We started integrating some of the non-controversial concepts (like the approval reset switch) into FPC and are currently refactoring GoShimmer to support this
We are also planning to make the big mana holders publish their opinion in the tangle as a public statement, which allows us to measure the rate of approval in a similar way as multiverse would do it
So it's starting to converge a bit, but we are still using FPC as a metastability breaking mechanism
Once the changes are implemented it should be pretty easy to simulate and test both approaches in parallel
Serguei Popov [IF] — yesterday 5:53 PM
So the ask is that we ditch all our work and fork Avalanche because it has not been attacked in the month or so it has been up?
u/Navin Ramachandran [IF] yeah, that's hilarious. Avalanche consensus (at least their WP version) is clearly scientifically unsound.
Hans Moog [IF] — yesterday 9:43 PM
u/wtf maybe you should research avalanche before proposing such a stupid idea
and you will see that what I wrote is actually true
Hans Moog [IF] — yesterday 9:44 PM
paying fees is what "protects" them atm
and simply the fact that nobody uses the network for anything of value yet
we can't rely on fees making attack vectors "unattractive"
Serguei Popov [IF] — yesterday 10:17 PM
well (1.) very obviously the metastability problems are not a problem in practice,
putting "very obviously" before questionable statements very obviously shows that you are seeking a constructive dialogue 📷 (to make metastability work, the adversary needs to more-or-less know the current opinion vectors of most of the honest participants; I don't see why a sufficiently well-connected adversary cannot query enough honest nodes frequently enough to achieve that)
(2.) .... you'd need an unpredictable number every few tens/hundreds milliseconds, but your DRNG can only produce one every O(seconds).
the above assumption (about "every few tens/hundreds milliseconds") is wrong
We've had this discussion before, where you argued that the assumptions in the FPC-BI paper (incl. "all nodes must be known") are not to be taken 100% strictly, and that the results are to be seen more of an indication of overall performance.
Aham, I see. So, unfortunately, all that time that I invested into explaining that stuff during our last conversation was for nothing. Again, very briefly. The contents of the FPC-BI paper is not "an indication of overall performance". It rather shows (to someone who actually read and understood the paper) why the approach is sound and robust, as it makes one understand what is the mechanism that causes the consensus phenomenon occur.
Yet you don't allow for that same argument to be valid for the "metastability" problem in avalanche,
Incorrect. It's not "that same argument". FPC-BI is a decent academic paper that has precisely formulated results and proofs. The Ava WP (the probabilistic part of it), on the other hand, does not contain proofs of what they call results. More importantly, they don't even show a clear path to those proofs. That's why their system is scientifically unsound.
even when there's a live network that shows that it doesn't matter.
No, it doesn't show that it doesn't matter. It only shows that it works when not properly attacked. Their WP doesn't contain any insight on why those attacks would be difficult/impossible.
Hans Moog [IF] — yesterday 10:56 PM
That proposal was so stupid - Avalanche does several things completely differently and we are putting quite a bit of effort into our solution to pretty much fix all of Avalanche's shortcomings
If we just wanted to have a working product and didn't care about security or performance then we could have just forked a blockchain
I am pretty confident that once we are done, it's going to be extremely close to the best theoretical thresholds that DLTs will ever be able to achieve for an unsharded base layer
-------------------------------------------------------------------------------------------------------------
Bas — yesterday 2:43 AM
Yesterday I was asked how a reasonably big company no one has heard of could best move forward implementing Access for thousands of locations worldwide. (Sorry for the vagueness, it's all confidential.) They read the article and want to implement it because it seems to fit a problem they're currently trying to solve. Such moves will vastly increase the utility of protocols like IOTA, and this is what the speculation is built on. I do not think you can overestimate what impact Access is going to have. It's cutting out the middleman for simple things; no server or service needed. That's huge.
So yes, I think this space will continue to grow u/Coinnave

--------------------------------------------------------------------------------------------------------------
Angelo Capossele [IF] — 2020-10-02
In short: we are planning a new v0.3.0 release that should happen very soon. This version will bring fundamental changes to the structure of the entire codebase (but without additional features) so that progressing with the development will be easier and more consistent. We have also obtained outstanding results with the dRNG committee managed by the GoShimmer X-Team, so that will also be an integral part of v0.3.0. After that, we will merge the Value Tangle with the Message Tangle, so as to have only one Tangle and make the TSA and the orphanage easier to manage. And we are also progressing really well with Mana, which will be the focus after the merge. More or less this is what is going to happen this month.
We will release further details in the upcoming Research Status Update

submitted by btlkhs to Iota

Lition - $8 Million Dollar Market Cap With Real Use Right Now and a New Product They Are Developing Which Has Huge Potential.

Preface

I’m not usually one to shill my own coins but I’ve stolen a few good picks from this sub so I thought I’d share a new one I recently stumbled upon. Before I go into more details, I’d like to preface this by saying that I never invest in anything which I don’t think has the fundamentals to last at least 5-10 years and I don’t think this is a project which you will see a few hundred percent gains in a month or two. The hype isn’t there with this project and it’s more of a mid-long term play. If you want overnight gains, gamble on some of the smaller caps posted in this sub which are more like ponzi schemes riding on DeFi hype which you sell to a greater fool.

Introduction

Lition is a layer 2 blockchain infrastructure on top of Ethereum that enables commercial usage of dApps. The Lition protocol complements the Ethereum mainchain by adding features such as privacy, scalability and deletability for GDPR compliance. Everybody can choose to build on Lition without the need for permission.
In addition to the above, they also have a P2P energy trading platform currently operating and is supplying green power to customers in over 1000 towns and cities across Germany. Through their power platform, Lition customers are able to save about 20% on their monthly energy bill, while producers generate up to 30% higher profits since they are cutting out the middle men.
However, the real moonshot here is not their already successful smart energy platform (which utilises the same token); it is the enterprise layer 2 solution described in the quote above.
Their layer 2 enterprise infrastructure, which is still in development, will offer infinite scalability through sidechains and nodes staking LIT tokens on these sidechains. Block times will be fast at around 3 seconds and fees will be tiny fractions of a cent. However, the real selling point for enterprises will be that the data on these sidechains can be deleted and can be public or private, with private chains being validated via zero-knowledge proofs to verify that the private data is correct. This is huge and makes Lition a solution for a wide range of enterprise use cases due to these optional features. But it doesn't stop there. Lition is also GDPR compliant - a big deal for Europe-based enterprises and, for the record, very very few blockchain solutions are GDPR compliant (I believe VeChain is one of the few other projects which are).

Important Bullet Points

Tokenomics

Their token has two primary uses. First, it is a utility token and they plan on making the LIT token the preferred payment method for all of the services on the Lition protocol. Secondly, it is used as collateral for staking which I can see locking up a large proportion of the supply in the future.
Unfortunately the circulating supply is currently 50% of the max supply but that said, coins like LINK have just 35% of the total tokens currently circulating, so relative to other projects, this isn’t too bad and many of the tokens are still to be earned by staking.

Conclusion

With their existing energy platform seeing real adoption and steady growth in Germany, in my opinion this alone would be enough to justify their current market cap. However, I can see their second layer solution for enterprise being a really big deal in the future, as protocol coins tend to accrue more value than utility tokens. As a versatile L2 solution for Ethereum, LIT gets the best of both worlds - adoption and network effects from Ethereum by helping it to scale, as well as accruing value from the wide range of enterprise use cases which can be built on top of Lition. At just $8 million dollars in market cap, it seems to me that their work-in-progress L2 enterprise solution has not been priced in. However, due to a lack of hype and marketing right now, I don't see LIT exploding in the short term. Rather, I can see it slowly outperforming ETH and climbing up the CMC rankings throughout this bullrun, much like Chainlink did in the bear market. Their building-and-partnerships-over-marketing strategy also reminds me of when I held Chainlink back in 2018, when Sergey was busy building out the project rather than blowing their ICO money on marketing a bunch of vaporware like so many other projects.
Personally, I can see LIT becoming a top 100 project (not top 10) as it isn’t the first of an important new type of project like Chainlink was/is but it is an L2 protocol with unique advantages and selling points over other existing L2 projects which scatter the top 20-200 range. This would put the market cap at just under $120 million dollars which is a 15x from here. This is of course a valuation which assumes that the total crypto market cap remains where it is right now at just under $400 billion dollars. However, if BTC makes it to 100K and Ethereum gets to $5K then that is another 10x from here which compounds on any LIT/BTC or LIT/ETH ratio gains. In this scenario, a top 100 project would be worth around $1 BILLION DOLLARS by market cap which is over 100x from here and probably even more if ETH hits 10K and Bitcoin dominance falls back down to the 30% range or below towards the end of the bullrun. Disclaimer, the above figures are a theoretical best case scenario and are far from financial advice. They are my moonshot estimates which assumes all goes well for the project and the wider crypto space.
Website: https://www.lition.io/
CoinGecko: https://www.coingecko.com/en/coins/lition
Medium: https://medium.com/lition-blog

TL;DR

TL;DR: LIT has current real world use which is consistently growing with their P2P energy trading platform and has huge potential with their new L2 protocol for enterprise due to its unique features. They have a close partnership with SAP and are also partnered with Microsoft. Currently around #400 on CMC, my target is for LIT to be top 100 by the end of the bullrun.
Edit: Sorry 4chan, I didn't mean to shill one of your FUDed coins. Lit is a shitcoin scam, ignore this post.
submitted by Tricky_Troll to CryptoMoonShots

Stakenet (XSN) - A DEX with interchain capabilities (BTC-ETH), Huge Potential [Full Writeup]

Preface
Full disclosure here; I am heavily invested in this. I have picked up some real gems from here and was only in the position to buy so much of this because of you guys so I thought it was time to give back. I only invest in Utility Coins. These are coins that actually DO something, and provide new/build upon the crypto infrastructure to work towards the end goal that Bitcoin itself set out to achieve(financial independence from the fiat banking system). This way, I avoid 99% of the scams in crypto that are functionless vapourware, and if you only invest in things that have strong fundamentals in the long term you are much more likely to make money.
Introduction
Stakenet is a Lightning Network-ready open-source platform for decentralized applications with its native cryptocurrency – XSN. It is powered by a Proof of Stake blockchain with trustless cold staking and Masternodes. Its use case is to provide a highly secure cross-chain infrastructure for these decentralized applications, where individuals can easily operate with any blockchain simply by using Stakenet and its native currency XSN.
Ok... but what does it actually do and solve?
The moonshot here is the DEX (Decentralised Exchange) that they are building. This is a lightning-network DEX with interchain capabilities. That means you could trade BTC directly for ETH; securely, instantly, cheaply and privately.
Right now, most crypto is traded to and from centralised exchanges like Binance. To buy and sell on these exchanges, you have to send your crypto to wallets on that exchange. That means the exchanges have your private keys, and they have control over your funds. When you use a centralised exchange, you are no longer in control of your assets, and depend on the trustworthiness of middlemen. We have of course seen infamous exit scams by centralised exchanges like Mt. Gox in the past.
The alternative? Decentralised exchanges. DEXs have no central authority and, most importantly, your private keys (your crypto) never leave YOUR possession and are never in anyone else's possession. So you can trade peer-to-peer without any of the drawbacks of centralised exchanges.
The problem is that this technology has not been perfected yet, and the DEXs available to us now do not provide cheap, private, quick trading on a decentralised medium because of their technological inadequacies. Take Uniswap for example. This DEX accounts for over 60% of all DEX volume and facilitates trading of ERC-20 tokens over the Ethereum blockchain. The problem? Because of the huge amount of transactions occurring over the Ethereum network, this has led to congestion (too many transactions for the network to handle at one time), so the fees have increased dramatically. Another big problem? It's only for Ethereum. You can't, for example, buy LINK with BTC. You must use ETH.
The solution? Layer 2 protocols. These are layers built ON TOP of existing blockchains, designed to solve the transaction and scaling difficulties that crypto as a whole is facing today (and that are ultimately stopping mass adoption). The developers at Stakenet have seen the big picture and have decided to implement the lightning network (a layer 2 protocol) into their DEX from the ground up. This will facilitate the functionalities of a DEX without any of the drawbacks of the CEXs and the DEXs we have today.
Here's someone much more qualified than me, Andreas Antonopoulos, to explain this:
https://streamable.com/kzpimj
'Once we have efficient, well designed DEXs on layer 2, there won't even be any DEXs on layer 1'
Progress
The Stakenet team were the first to envision this grand solution and have been working on it since its conception in June 2019. They have been making steady progress ever since, and right now the DEX is in an open beta stage where rigorous testing is constant, by themselves and the public. For a project of this scale, stress testing is paramount. If the product were to launch with any bugs/errors that resulted in the loss of a user's funds, this would obviously be very damaging to Stakenet's reputation. So I believe that the developers' conservative approach is wise.
As of now the only pairs tradeable on the DEX are XSN/BTC and LTC/BTC. The DEX has only just launched as a public beta and is not in its full public release stage yet. As development moves forward, more lightning network and atomic swap compatible coins will be added to the DEX, and of course, the team are hard at work on Raiden integration - this will allow ETH and tokens on the Ethereum blockchain to be traded on the DEX between separate blockchains (instantly, cheaply, privately). This is where Stakenet enters top 50 territory on CMC if successful, and is the true value here. Raiden integration is well underway and is being tested in a closed public group on Linux.
The full public DEX with Raiden Integration is expected to release by the end of the year. Given the state of development so far and the rate of progress, this seems realistic.
Tokenomics
2.6 Metrics overview (from whitepaper)
XSN is slightly inflationary, much like ETH, as this is necessary for the economy to be adopted and work in the long term. There is however a deflationary mechanism in place - all trading fees on the DEX get converted to XSN and 10% of these fees are burned. This puts constant buying pressure on XSN and acts as a deflationary mechanism. XSN has inherent value because it makes up the infrastructure that the DEX will run on, and as such masternode operators and stakers will see the fees from the DEX.
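As a quick illustration of that fee flow (the 10% burn is from the write-up; that the remainder goes to masternodes and stakers is my reading of it, not a verified protocol rule):

```python
# Toy model of the XSN fee mechanism described above: DEX trading
# fees are converted to XSN and 10% is burned. That the remaining
# 90% flows to masternode operators and stakers is an assumption
# drawn from the write-up, not a verified protocol rule.

BURN_RATE = 0.10

def settle_trading_fee(fee_in_xsn: float) -> dict:
    burned = fee_in_xsn * BURN_RATE
    to_nodes_and_stakers = fee_in_xsn - burned
    return {"burned": burned, "to_nodes_and_stakers": to_nodes_and_stakers}

print(settle_trading_fee(100.0))
# {'burned': 10.0, 'to_nodes_and_stakers': 90.0}
```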
Conclusion
We can clearly see that a layer 2 DEX is the future of cryptocurrency trading. It will facilitate secure, cheap, instant and private trading across all coins with lightning capabilities, thus solving the scaling and transaction issues that are holding back crypto today. I don't need to tell you the implications of this, and what it means for crypto as a whole. If Stakenet can launch a layer 2 DEX with Raiden integration, it will become the primary DEX in terms of volume.
Stakenet DEX will most likely be the first layer 2 DEX (first mover advantage), and its blockchain is the infrastructure that will host this DEX and subsequently receive its trading fees. It is not difficult to envision a time in the next year when Stakenet DEX is functional and hosting hundreds of millions of dollars worth of trading every single day.
At a $30 million market cap, I can't see any other potential investment right now with this much potential upside.
This post has merely served as an introduction and a heads up for this project; there is MUCH more to cover like vortex liquidity, masternodes, TOR integration... for now, here is some additional reading: Resources
TLDR; No. Do you want to make money? I'd start with learning how to read.
submitted by hotprocession to CryptoMoonShots

Philly's Weekly Watchlist [LONG]

Since a few people appreciated my list last week, I figured I'd drop it again for everyone, not just the few people that I constantly chat with.
8/2 WEEKLY WATCHLIST
[P.S. Only enter positions you feel the most comfortable with. Your money is your soldier - only send him into a battle you think you'll win. Some of these I have taken positions in. Some I am looking to take positions in. I've posted how many shares I own of what multiple times]
💸PENNIES💸
[💎-Long time gold][⁉️-Could go both ways][🚀Rocket Emoji-I think this is gonna shoot up][🔥-This is a HOT pick][⚠️-Already ran a bit be careful][👀-Watching this one closely]
🚀💸PENNYS💸🚀
$AMTX - Golden triangle. Looks to still have fuel in the rocket. $1.10-$1.15 imo isn't a bad entry. $1.12 is the WEEKLY support. Overall support is a freefall to $0.80. I expect a $1.40-$1.45 run. PR on Tuesday⚠️👀[Rocket Emoji]🚀
$BNGO - Big virtual booth Aug 4th-5th. Huge up-and-coming biotech company. Support at $0.74 & $0.65. Resistance at $0.82, then $0.95. This could rip up with the right volume👀🔥 [Rocket Emoji]🚀
$AIM - Web conference Monday 1:30 EST. I honestly see this hitting $5 in the long run, but it should run up Friday into Monday. About 70% of shareholders are breakeven or at a loss. Decent support at $2.72. Godly support at $2.44. Resistance at $3/$3.35.💎🔥👀[Rocket Emoji]🚀
$ATNM - Balance sheet shows easily enough money for another quarter without an offering. Earnings Aug 7th [Estimated 56% growth]. Sabby is playing with this [scary] but this monster "should" RIP UP! Support is $0.52-$0.54. Weak support at $0.57. Resistances at $0.61/$0.64/$0.68🔥👀[Twitter pumping this too]
$BKYI - African contracts need to be finalized and this is gonna ZOOM ZOOM ZOOM! Had a single buyer with a 200k share bid at $0.75. Looks like it made a new support at $0.69 off old resistance levels. Seems to be rough resistance at the $0.71-$0.74 range. After that could run $0.77-$0.82🔥👀[Rocket Emoji]🚀
$BIOC - Insider buys 7/14 of 20k shares. Bullish uptrend. Decent support at $0.68, $0.63, $0.60. Until DEC 7th for compliance, so decent amount of time still. I'm bullish AF to $0.80, maybe $1. Broke $0.725 resistance. Talks of a RESPLIT THOUGH! 6/25 Golden Cross! [Chart if you wanna see, just ask]
$CHEK - 70% of shareholders at a loss. Mad support at the $0.53 area. Above $0.61 I'd be super bullish. I see an ascending triangle. This baby wants to break out. MACD is set up perfectly. Volume Friday smacked it up. This company is REALLY dedicated to pushing for $1 for compliance!🔥💎👀
$IZEA - Aug 18th webinar. TikTok partnership RUMOR?!?! Insane support at $1.02. Small resistance at $1.47. I see resistance at $1.66👀🔥⚠️[Rocket Emoji]🚀🚀🚀
$SXTC - 99% of shareholders breakeven or at a loss. Had insane support at $0.40 and broke down. New support is $0.36. Something tells me this is an EASY gap up to $0.42-$0.44. Low float🔥[Rocket Emoji]🚀
$JFU - UNGODLY OVERSOLD. 90% of shareholders breakeven or at a loss. MACD set up on the daily. Should EASILY gap up to $2.40-$2.60. BITCOIN PLAY🔥[Rocket Emoji]🚀
$MARA/$RIOT - BTC plays. MARA imo is the better option. They are debt free vs RIOT's 200m debt👀🔥⚠️ [Rocket Emoji]🚀
$ENZ - Has FDA approval; no one else has this test. Monopoly. Schools testing. State colleges already buying them. 98% of shareholders are breakeven or at a loss! REVENUE UP 121% IN 2019. Looks to be at support at $2.35; beyond that, around $2.08. Resistances sitting at around $2.55 and $2.70.👀
$MYT - $0.40 offering price. I wouldn't mind getting it around $0.38-$0.42. US store in trial phase.
$DLPN - FORESEE a HUGE gap up here! Support at $0.82, then a freefall to $0.49. SMALL resistance at $0.91. Then resistance at $1/$1.07. Had an offering at $1.05 2 months ago. Only scary thing is they might split due to compliance👀[Rocket Emoji]🚀
$LPCN - FDA Approval Aug 28th. This has been a CONSTANT RUNNER 💎🔥[Rocket Emoji]🚀
$BOXL - Offering closed Friday. PR is imminent. 99% of shareholders are at a LOSS! Chart looks like a BULLISH pennant. $2.20 is OKAY support. $1.70 is pretty strong support. $2.30 looks like the first soft resistance. If $2.45 gets broken we could see a $3 run👀
$ONTX - Made compliance on Friday. Massive support at $1.12. Dropping Twitter PR like wildfire. Resistance seems to be in $0.05 intervals starting at $1.20. After $1.45 it's a straight RIP up to $2.65👀[Rocket Emoji]
$IDEX - Looks like old $1.38-$1.40 support is being rebuilt. Bullish as hell if this breaks $1.51. Earnings August 11th🔥👀[Rocket Emoji]🚀
💰HONORABLE MENTIONS💰: $VERB - [Offering at $1.10, good around that price] $NAK $UAVS $MVIS $GAU[Gold mine]🔥 $PZG[Gold mine]🔥 $JAN🔥👀
💰Non-Pennys💰
$MGM - EPS was BETTER than projected. Revenue in the gutter. Didn't have the sell-off I thought. Still a good price LONG. MGM is 1 of 3 casinos with licensing in Japan. By 2030 this should be a $40-$45 ticker💎🔥⚠️👀 [Rocket Emoji]🚀
$CZR aka $ERI - COMEBACK KING! Hasn't been this cheap since 2017. THIS SHOULD RUN UP to $35-$38 shortly. Biggest casino/hotel chain in the WORLD after buying out Caesars. Should be a $70-$100 ticker by 2030-2035💎🔥⚠️👀 [Rocket Emoji]🚀
$O - MONTHLY dividend. [5% yearly] GREAT LONG term investment. 💎
$JMIA - Monthly MACD set up so perfectly for this. Has been running lately but nowhere near pre-rona levels. HOPING FOR A SELL-OFF TO TAKE A POSITION. Offering at $8.59, BUT it's a shelf offering which means they don't have to sell it currently. This could drop down to that or continue its run until the offering block is dropped.👀⚠️🔥
$CNTG - Around 80-90% of shareholders BREAKEVEN or at a LOSS! 600 US schools + 3 German airports so far. US mobile semi-truck lab. So oversold it's asking for change!🔥[Rocket Emoji]🚀
$WIMI - $8 OFFERING. I LOVE OFFERING plays without mass dilution<3 🔥💎 👀[Rocket Emoji]🚀
$SPAQ - Tons of pre-orders aka free revenue without advertising. This should take off like NKLA did eventually. 4hr chart approaching oversold. 94% of shareholders at breakeven or a loss! $10.60 is strong AF support. $13.95 is the first real resistance. If this breaks the $12.45/50 range, SUPER bullish. Fisker dropping mad PR hints on Twitter 🔥👀[Rocket Emoji]🚀
🔥🌾Gold/Silver🌾🔥
$AGC - 2x silver. Aka silver -1%, AGC -2%. This is a day or swing trade. Depreciates🔥
$SLV - Long term silver hold🔥
$JNUG - 2x Gold/Silver Junior Miners 🔥
$NUGT - 2x Gold/Silver Miners🔥
$GLD - Long term gold holds👀🔥
🔮BET AGAINST THE MARKET🔮
$SPXS - 3x inverse of SPY [the overall market]. SPY +1% → SPXS -3%. SPY -3% → SPXS +9%
$VXX - Fear index/volatility index. This goes up with market fear/uncertainty. USUALLY inverses $SPY🚀🚀
Newfilter.io [USE THIS SITE, LOVE THIS SITE, BEFRIEND THIS SITE]. It gives live news [1-5 mins delayed]. I refresh the FDA approval feed constantly and the latest news pretty often.
P.S. I have CALLS for $VXX [I believe market volatility/uncertainty is going to SPIKE high as hell this week the longer the feds take with the unemployment stimulus and the stimulus in general]
I have put spreads on $SPY. I believe $SPY is going to drop for the above reason.
submitted by Philly19111 to pennystocks

Why i’m bullish on Zilliqa (long read)

Edit: TL;DR added in the comments
 
Hey all, I've been researching coins since 2017 and have gone through 100s of them in the last 3 years. I got introduced to blockchain via Bitcoin of course, analyzed Ethereum thereafter, and from that moment I have had a keen interest in smart contract platforms. I'm passionate about Ethereum but I find Zilliqa to have a better risk-reward ratio, especially because Zilliqa has found an elegant balance between being secure, decentralized and scalable in my opinion.
 
Below I post my analysis of why, of all the coins I went through, I'm most bullish on Zilliqa (yes, I went through Tezos, EOS, NEO, VeChain, Harmony, Algorand, Cardano etc.). Note that this is not investment advice, and although it's a thorough analysis there is obviously some bias involved. Looking forward to what you all think!
 
Fun fact: the name Zilliqa is a play on 'silica' (silicon dioxide), as in "silicon for the high-throughput consensus computer."
 
This post is divided into (i) Technology, (ii) Business & Partnerships, and (iii) Marketing & Community. I’ve tried to make the technology part readable for a broad audience. If you’ve ever tried understanding the inner workings of Bitcoin and Ethereum you should be able to grasp most parts. Otherwise, just skim through and once you are zoning out head to the next part.
 
Technology and some more:
 
Introduction
 
The technology is one of the main reasons why I'm so bullish on Zilliqa. The first thing you see on their website is: "Zilliqa is a high-performance, high-security blockchain platform for enterprises and next-generation applications." These are some bold statements.
 
Before we deep dive into the technology, let's take a step back in time first as they have quite the history. The initial research paper from which Zilliqa originated dates back to August 2016: Elastico: A Secure Sharding Protocol For Open Blockchains, where Loi Luu (Kyber Network) is one of the co-authors. Other ideas that led to the development of what Zilliqa has become today are: Bitcoin-NG, collective signing CoSi, ByzCoin and Omniledger.
 
The technical white paper was made public in August 2017 and since then they have achieved everything stated in it, and have also created their own open-source intermediate-level smart contract language called Scilla (a functional programming language similar to OCaml).
 
Mainnet has been live since the end of January 2019, with daily transaction rates growing continuously. About a week ago mainnet reached 5 million transactions and 500,000+ addresses in total, along with 2400 nodes keeping the network decentralized and secure. Circulating supply is nearing 11 billion and currently only mining rewards are left. The maximum supply is 21 billion, with annual inflation currently at 7.13% and only decreasing with time.
 
Zilliqa realized early on that the usage of public cryptocurrencies and smart contracts was increasing but decentralized, secure, and scalable alternatives were lacking in the crypto space. They proposed to apply sharding to a public smart contract blockchain, where the transaction rate increases almost linearly with the number of nodes. More nodes = higher transaction throughput and increased decentralization. Sharding comes in many forms and Zilliqa uses network, transaction and computational sharding. Network sharding opens up the possibility of using transaction and computational sharding on top. Zilliqa does not use state sharding for now. We'll come back to this later.
 
Before we continue dissecting how Zilliqa achieves this from a technological standpoint, it's good to keep in mind that making a blockchain decentralised, secure and scalable at the same time is still one of the main hurdles preventing widespread usage of decentralised networks. In my opinion this needs to be solved first before blockchains can get to the point where they can create and add large-scale value. So I invite you to read the next section to grasp the underlying fundamentals. Because after all, these premises need to be true, otherwise there isn't a fundamental case to be bullish on Zilliqa, right?
 
Down the rabbit hole
 
How have they achieved this? Let's define the basics first: the key players on Zilliqa are the users and the miners. A user is anybody who uses the blockchain to transfer funds or run smart contracts. Miners are the (shard) nodes in the network who run the consensus protocol and get rewarded for their service in Zillings (ZIL). The mining network is divided into several smaller networks called shards, which is also referred to as 'network sharding'. Miners are subsequently randomly assigned to a shard by another set of miners called DS (Directory Service) nodes. The regular shards process transactions, and the outputs of these shards are eventually combined by the DS shard as they reach consensus on the final state. More on how these DS shards reach consensus (via pBFT) will be explained later on.
 
The Zilliqa network produces two types of blocks: DS blocks and Tx blocks. One DS block consists of 100 Tx blocks. And as previously mentioned there are two types of nodes concerned with reaching consensus: shard nodes and DS nodes. Becoming a shard node or DS node is defined by the result of a PoW cycle (Ethash) at the beginning of the DS block. All candidate mining nodes compete with each other and run the PoW (Proof-of-Work) cycle for 60 seconds, and the submissions achieving the highest difficulty will be allowed on the network. To put it in perspective: the average difficulty for one DS node is ~2 Th/s, equaling 2,000,000 Mh/s or 55 thousand+ GeForce GTX 1070 / 8 GB GPUs at 35.4 Mh/s. Each DS block, 10 new DS nodes are allowed in. And a shard node needs to provide around 8.53 GH/s currently (around 240 GTX 1070s). Dual mining ETH/ETC and ZIL is possible and can be done via mining software such as Phoenix and Claymore. There are pools, and if you have large amounts of hashing power (Ethash) available you could mine solo.
 
The 60-second PoW cycle is a peak-performance burst and acts as an entry ticket to the network. The entry ticket serves as a Sybil resistance mechanism and makes it incredibly hard for adversaries to spawn lots of identities and manipulate the network with them. After every 100 Tx blocks, which corresponds to roughly 1.5 hours, this PoW process repeats. In between, no PoW needs to be done, meaning Zilliqa’s energy consumption to keep the network secure is low. For more detailed information on how mining works click here.
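As a rough sketch of that duty cycle (assuming the ~45-second Tx block time mentioned later in this article):

```python
# PoW is only a short burst per DS epoch; consensus itself needs no hashing.
TX_BLOCK_TIME_S = 45            # assumed average Tx block time
TX_BLOCKS_PER_EPOCH = 100       # one DS block = 100 Tx blocks
POW_WINDOW_S = 60               # PoW entry-ticket window per epoch

epoch_s = TX_BLOCK_TIME_S * TX_BLOCKS_PER_EPOCH
print(epoch_s / 3600)                            # ~1.25 h, i.e. "roughly 1.5 hours"
print(POW_WINDOW_S / (epoch_s + POW_WINDOW_S))   # ~0.013: only ~1% of time is PoW
```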
Okay, hats off to you. You have made it this far. Before we go any deeper down the rabbit hole, we first must understand why Zilliqa goes through all of the above technicalities and understand a bit better what a blockchain is on a more fundamental level. Because the core of Zilliqa’s consensus protocol relies on pBFT (practical Byzantine Fault Tolerance), we need to know more about state machines and their function. Navigate to Viewblock, a Zilliqa block explorer, and then come back to this article. We will use this site to navigate through a few concepts.
 
We have established that Zilliqa is a public and distributed blockchain, meaning that everyone with an internet connection can send ZILs, trigger smart contracts, etc., and there is no central authority who fully controls the network. Zilliqa and other public and distributed blockchains (like Bitcoin and Ethereum) can also be defined as state machines.
 
Taking the liberty of paraphrasing examples and definitions from Samuel Brooks’ Medium article: he defines a blockchain (like Zilliqa) as “a peer-to-peer, append-only datastore that uses consensus to synchronize cryptographically-secure data”.
 
Next, he states that “blockchains are fundamentally systems for managing valid state transitions”. For some more context, I recommend reading the whole Medium article to get a better grasp of the definitions and understanding of state machines. Nevertheless, let’s try to simplify and compile it into a single paragraph. Take traffic lights as an example: all their states (red, amber, and green) are predefined, all possible outcomes are known, and it doesn’t matter if you encounter the traffic light today or tomorrow; it will still behave the same. Managing the states of a traffic light can be done by triggering a sensor on the road or pushing a button, resulting in one traffic light's state going from green to red (via amber) and another light's from red to green.
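In code, the traffic light is a tiny finite state machine; a minimal sketch (my own illustration, not anything Zilliqa-specific):

```python
# All states and transitions are known upfront, so behaviour is fully predictable.
TRANSITIONS = {
    ("green", "button"): "amber",
    ("amber", "timer"): "red",
    ("red", "timer"): "green",
}

def step(state: str, event: str) -> str:
    # An unknown (state, event) pair simply leaves the state unchanged.
    return TRANSITIONS.get((state, event), state)

state = "green"
for event in ("button", "timer", "timer"):
    state = step(state, event)
    print(state)    # amber, red, green
```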
 
With public blockchains like Zilliqa this isn’t so straightforward and simple. It started with block #1 almost 1.5 years ago, and every 45 seconds or so a new block linked to the previous one is added, resulting in a chain of blocks with transactions in them that everyone can verify from block #1 to the current #647,000+ block. The state is ever-changing and the states it can find itself in are infinite. And while a traffic light might work in tandem with various other traffic lights, that is rather insignificant compared to a public blockchain, because Zilliqa consists of 2400 nodes that need to work together to reach consensus on the latest valid state while some of these nodes may have latency or broadcast issues, drop offline, or deliberately try to attack the network, etc.
 
Now go back to the Viewblock page, take a look at the number of transactions, addresses, and the block and DS height, and then hit refresh. As expected, you will see newly incremented values for some or all of these parameters. So how did the Zilliqa blockchain manage to transition from the previous valid state to the latest valid state? By using pBFT to reach consensus on the latest valid state.
 
After having obtained the entry ticket, miners execute pBFT to reach consensus on the ever-changing state of the blockchain. pBFT requires a series of network communications between nodes, and as such no GPU is involved (only CPU), resulting in a low total energy consumption to keep the blockchain secure, decentralized, and scalable.
 
pBFT stands for practical Byzantine Fault Tolerance and is an optimization of the Byzantine Fault Tolerance algorithm. To quote Blockonomi: “In the context of distributed systems, Byzantine Fault Tolerance is the ability of a distributed computer network to function as desired and correctly reach a sufficient consensus despite malicious components (nodes) of the system failing or propagating incorrect information to other peers.” Zilliqa is such a distributed computer network and depends on the honesty of the nodes (shard and DS) to reach consensus and to continuously update the state with the latest block. If pBFT is a new term for you I can highly recommend the Blockonomi article.
 
The idea of pBFT was introduced in 1999 - one of the authors even won a Turing award for it - and it is well researched and applied in various blockchains and distributed systems nowadays. If you want more advanced information than the Blockonomi link provides, click here. And if you want something in between Blockonomi and the University of Singapore material, read the Zilliqa Design Story Part 2 from October 2017.
Quoting from the Zilliqa tech whitepaper: “pBFT relies upon a correct leader (which is randomly selected) to begin each phase and proceed when the sufficient majority exists. In case the leader is byzantine it can stall the entire consensus protocol. To address this challenge, pBFT offers a view change protocol to replace the byzantine leader with another one.”
 
pBFT can tolerate up to (but not including) ⅓ of the nodes being dishonest (offline counts as Byzantine, i.e. dishonest), and the consensus protocol will keep functioning without stalling or hiccups. Once ⅓ or more of the nodes are dishonest, but no more than ⅔, the network stalls and a view change is triggered to elect a new DS leader. Only when more than ⅔ of the nodes (over 66%) are dishonest do double-spend attacks become possible.
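Those thresholds follow directly from the pBFT math; a minimal sketch for one 600-node shard:

```python
# For n = 3f + 1 nodes, pBFT tolerates at most f Byzantine nodes and
# commits once a quorum of 2f + 1 matching votes is collected.
def pbft_thresholds(n: int):
    f = (n - 1) // 3        # maximum tolerable Byzantine/offline nodes
    quorum = 2 * f + 1      # votes needed to commit a block
    return f, quorum

f, quorum = pbft_thresholds(600)   # one Zilliqa shard
print(f)       # 199 -> progress while fewer than 1/3 are dishonest
print(quorum)  # 399 -> roughly the 2/3 agreement described above
```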
 
If the network stalls, no transactions can be processed and one has to wait until a new honest leader has been elected. When the mainnet had just launched and was in its early phases, view changes happened regularly. As of today, the last stalling of the network - and view change being triggered - was at the end of October 2019.
 
Another benefit of using pBFT for consensus, besides low energy consumption, is the immediate finality it provides. Once your transaction is included in a block and the block is added to the chain, it’s done. Lastly, take a look at this article where three types of finality are defined: probabilistic, absolute, and economic finality. Zilliqa falls under absolute finality (just like Tendermint, for example). Although lengthy already, we have only skimmed some of the inner workings of Zilliqa’s consensus: read the Zilliqa Design Story Part 3 and you will be close to having a complete picture of it. Enough about PoW, the Sybil resistance mechanism, pBFT, etc. Another thing we haven’t looked at yet is the degree of decentralization.
 
Decentralisation
 
Currently, there are four shards, each of them consisting of 600 nodes: 1 shard with 600 so-called DS nodes (Directory Service - they need to achieve a higher difficulty than shard nodes) and 1800 shard nodes, of which 250 are shard guards (centralized nodes controlled by the team). The number of shard guards has been steadily declining, from 1200 in January 2019 to 250 as of May 2020. On the Viewblock statistics you can see that many of the nodes are located in the US, but those are only the (CPU parts of the) shard nodes that perform pBFT; there is no data on where the PoW hashpower comes from. And when the Zilliqa blockchain starts reaching its transaction capacity limit, a network upgrade needs to be executed to lift the current cap of 2400 nodes, allowing more nodes and the formation of more shards, which will let the network keep scaling according to demand.
Besides shard nodes there are also seed nodes. The main role of seed nodes is to serve as direct access points (for end-users and clients) to the core Zilliqa network that validates transactions. Seed nodes consolidate transaction requests and forward these to the lookup nodes (another type of nodes) for distribution to the shards in the network. Seed nodes also maintain the entire transaction history and the global state of the blockchain which is needed to provide services such as block explorers. Seed nodes in the Zilliqa network are comparable to Infura on Ethereum.
 
The seed nodes were at first operated only by Zilliqa themselves, exchanges, and Viewblock. Operators of seed nodes like exchanges had no incentive to open them up to the greater public, so they were centralised at first. Decentralisation at the seed node level has been steadily rolled out since March 2020 (Zilliqa Improvement Proposal 3). Currently the number of seed nodes is being increased, they are public-facing, and at the same time PoS is applied to incentivize seed node operators and make it possible for ZIL holders to stake and earn passive yields. Important distinction: seed nodes are not involved in consensus! That is still PoW as the entry ticket and pBFT for the actual consensus.
 
5% of the block rewards are assigned to seed nodes (from the beginning in 2019), and those are used to pay out ZIL stakers. The 5% of block rewards at an annual yield of 10.03% translate to roughly 610 MM ZIL in total that can be staked. Exchanges use the custodial variant of staking, and wallets like Moonlet will use the non-custodial version (starting in Q3 2020). Staking is done by sending ZILs to a smart contract created by Zilliqa and audited by Quantstamp.
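A quick sanity check of those staking numbers (illustrative only; the live figures move with network parameters):

```python
annual_yield = 0.1003           # 10.03% annual yield quoted above
stakeable_zil = 610_000_000     # ~610 MM ZIL stakeable in total

annual_rewards = stakeable_zil * annual_yield
print(round(annual_rewards))    # ~61,183,000 ZIL/year paid to stakers,
                                # funded by the 5% of block rewards
                                # assigned to seed nodes
```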
 
With a high number of DS and shard nodes, and with seed nodes becoming more decentralized too, Zilliqa qualifies for the label of decentralized in my opinion.
 
Smart contracts
 
Let me start by saying I’m not a developer and my programming skills are quite limited. So I’m taking the ELI5 route (maybe ELI12), but if you are familiar with JavaScript, Solidity or specifically OCaml, please head straight to Scilla - read the docs to get a good initial grasp of how Zilliqa’s smart contract language Scilla works, and if you ask yourself “why another programming language?” check this article. If you want to play around with some sample contracts in an IDE, click here. The faucet can be found here. And more information on architecture, dapp development and the API can be found on the Developer Portal.
If you are more into listening and watching: check this recent webinar explaining Zilliqa and Scilla. Link is time-stamped so you’ll start right away with a platform introduction, roadmap 2020 and afterwards a proper Scilla introduction.
 
Generalized: programming languages can be divided into being ‘object-oriented’ or ‘functional’. Here is an ELI5 given by a software development academy: “all programs have two basic components, data – what the program knows – and behavior – what the program can do with that data. So object-oriented programming states that combining data and related behaviors in one place, is called “object”, which makes it easier to understand how a particular program works. On the other hand, functional programming argues that data and behavior are different things and should be separated to ensure their clarity.”
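A toy illustration of that split (my own example, in Python rather than Scilla, purely to show the idea):

```python
# Object-oriented: data and behaviour live together and mutate in place.
class Counter:
    def __init__(self):
        self.value = 0            # data

    def increment(self):          # behaviour bound to the data
        self.value += 1

# Functional: plain immutable data, behaviour as a separate function
# that returns a new value instead of mutating anything.
def increment(value: int) -> int:
    return value + 1

c = Counter()
c.increment()
print(c.value)        # 1 (the object changed its own state)
print(increment(0))   # 1 (the input 0 is left untouched)
```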
 
Scilla is on the functional side and shares similarities with OCaml: OCaml is a general-purpose programming language with an emphasis on expressiveness and safety. It has an advanced type system that helps catch your mistakes without getting in your way. It's used in environments where a single mistake can cost millions and speed matters, is supported by an active community, and has a rich set of libraries and development tools. For all its power, OCaml is also pretty simple, which is one reason it's often used as a teaching language.
 
Scilla is blockchain agnostic (it can be implemented on other blockchains as well), is recognized by academics, and won a Distinguished Artifact Award at the end of last year.
 
One of the reasons the Zilliqa team decided to create their own programming language, focused on preventing smart contract vulnerabilities, is that adding logic to a blockchain means you cannot afford to make mistakes; otherwise it can cost you dearly. It’s all great and fun that blockchains are immutable, but updating your code because you found a bug isn’t the same as with a regular web application, for example. And with smart contracts, cryptocurrencies, and thus value, are inherently involved in some form.
 
Another difference with programming languages on a blockchain is gas. Every transaction you execute on a smart contract platform like Zilliqa or Ethereum costs gas. With gas you basically pay for computational costs. Sending a ZIL from address A to address B costs 0.001 ZIL currently. Smart contracts are more complex, often involve various functions, and require more gas (if gas is a new concept, click here).
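The general shape of a gas fee is just consumption times price; a hedged sketch (the gas amounts and price below are assumptions picked so a plain transfer lands on the 0.001 ZIL quoted above, not Zilliqa's actual parameters):

```python
def fee(gas_used: int, gas_price_zil: float) -> float:
    # Total fee = units of computation consumed x price per unit.
    return gas_used * gas_price_zil

GAS_PRICE = 0.00002           # assumed ZIL per gas unit
print(fee(50, GAS_PRICE))     # simple transfer: 50 gas -> 0.001 ZIL
print(fee(5_000, GAS_PRICE))  # contract call: 100x the gas -> 0.1 ZIL
```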
 
So with Scilla, similar to Solidity, you need to make sure that “every function in your smart contract will run as expected without hitting gas limits. An improper resource analysis may lead to situations where funds may get stuck simply because a part of the smart contract code cannot be executed due to gas limits. Such constraints are not present in traditional software systems”. Scilla design story part 1
 
Some examples of smart contract issues you’d want to avoid are: leaking funds, ‘unexpected changes to critical state variables’ (example: someone other than you setting his or her address as the owner of the smart contract after creation) or simply killing a contract.
 
Scilla also allows for formal verification. Wikipedia to the rescue: “In the context of hardware and software systems, formal verification is the act of proving or disproving the correctness of intended algorithms underlying a system with respect to a certain formal specification or property, using formal methods of mathematics.
 
Formal verification can be helpful in proving the correctness of systems such as: cryptographic protocols, combinational circuits, digital circuits with internal memory, and software expressed as source code.”
 
From the Scilla docs: “Scilla is being developed hand-in-hand with formalization of its semantics and its embedding into the Coq proof assistant — a state-of-the-art tool for mechanized proofs about properties of programs.”
 
Simply put, with Scilla and its accompanying tooling, developers can be mathematically sure, and can prove, that the smart contract they’ve written does what they intend it to do.
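To give a feel for what "proving a property" means, here is a tiny hand-rolled stand-in (my own Python example, not Scilla or Coq): it merely checks a conservation property over a small input space, whereas a real formal proof covers every possible case:

```python
def transfer(balances: dict, frm: str, to: str, amount: int) -> dict:
    if amount < 0 or balances.get(frm, 0) < amount:
        return balances                      # reject invalid transfers
    out = dict(balances)
    out[frm] -= amount
    out[to] = out.get(to, 0) + amount
    return out

# Property: a transfer never creates or destroys funds ("no leaking funds").
for start in range(5):
    for amount in range(-2, 7):
        before = {"alice": start, "bob": 3}
        after = transfer(before, "alice", "bob", amount)
        assert sum(after.values()) == sum(before.values())
print("conservation holds on all checked cases")
```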
 
Smart contract on a sharded environment and state sharding
 
There is one more topic I’d like to touch on: smart contract execution in a sharded environment (and the effect of state sharding). This is a complex topic, and I’m not able to explain it any more simply than what is posted here, but I will try to compress the post into something easy to digest.
 
Earlier on we established that Zilliqa can process transactions in parallel due to network sharding. This is where the linear scalability comes from. We can define three categories of transactions: a transaction from address A to B (Category 1), a transaction where a user interacts with one smart contract (Category 2), and the most complex ones, where triggering a transaction results in multiple smart contracts being involved (Category 3). The shards are able to process transactions on their own, without interference from the other shards. With Category 1 transactions that is doable; with Category 2 transactions it sometimes is, namely if the address is in the same shard as the smart contract; but with Category 3 you definitely need communication between the shards. Solving that requires a set of communication rules the protocol needs to follow in order to process all transactions in a generalised fashion.
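A toy model of that routing question (simplified, not Zilliqa's actual assignment logic):

```python
def needs_cross_shard(sender_shard: int, contract_shards: list) -> bool:
    # A transaction stays local only if every involved party sits in one shard.
    return len({sender_shard, *contract_shards}) > 1

print(needs_cross_shard(2, []))      # False: Category 1, always local
print(needs_cross_shard(2, [2]))     # False: Category 2, contract co-located
print(needs_cross_shard(2, [5]))     # True:  Category 2, contract elsewhere
print(needs_cross_shard(2, [2, 3]))  # True:  Category 3, multiple contracts
```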
 
And this is where the downsides of state sharding come in, and why Zilliqa doesn’t use it currently. All shards in Zilliqa have access to the complete state. Yes, the state size (0.1 GB at the moment) grows and all of the nodes need to store it, but it also means that they don’t need to shop around for information held on other shards, which would require more communication and add more complexity. Links if you want to dig further (computer science and/or developer knowledge required): Scilla - language grammar; Scilla - Foundations for Verifiable Decentralised Computations on a Blockchain; Gas Accounting; NUS x Zilliqa: Smart contract language workshop.
 
Easier-to-follow links on programming Scilla: https://learnscilla.com/home and Ivan on Tech.
 
Roadmap / Zilliqa 2.0
 
There is no strictly defined roadmap, but here are the topics being worked on. And via the Zilliqa website there is also more information on the projects they are working on.
 
Business & Partnerships
 
It’s not only in technology that Zilliqa seems to be excelling, as their ecosystem has been expanding and starting to grow rapidly. The project is on a mission to provide OpenFinance (OpFi) to the world, and Singapore is the right place to be due to its progressive regulations and futuristic thinking. Singapore has taken a proactive approach towards cryptocurrencies by introducing the Payment Services Act 2019 (PS Act). Among other things, the PS Act will regulate intermediaries dealing with certain cryptocurrencies, with a particular focus on consumer protection and anti-money laundering. It will also provide a stable regulatory licensing and operating framework for cryptocurrency entities, effectively covering all crypto businesses and exchanges based in Singapore. According to PwC, 82% of the surveyed executives in Singapore reported blockchain initiatives underway, and 13% of them have already brought those initiatives live to market. There is also an increasing list of organizations that are starting to provide digital payment services. Moreover, Singaporean blockchain developer Building Cities Beyond has recently created a $15 million innovation grant to encourage development on its ecosystem. This all suggests that Singapore is trying to position itself as (one of) the leading blockchain hubs in the world.
 
Zilliqa seems to already be taking advantage of this and recently helped launch Hg Exchange on their platform, together with financial institutions PhillipCapital, PrimePartners, and Fundnel. Hg Exchange, which is now approved by the Monetary Authority of Singapore (MAS), uses smart contracts to represent digital assets. Through Hg Exchange, financial institutions worldwide can use Zilliqa's safe-by-design smart contracts to enable the trading of private equities. For example, think of companies such as Grab, Airbnb, and SpaceX that are not available for public trading right now. Hg Exchange will allow investors to buy shares of private companies & unicorns and capture their value before an IPO. Anquan, the main company behind Zilliqa, has also recently announced that they became a partner and shareholder in TEN31 Bank, a fully regulated bank allowing for tokenization of assets that aims to bridge the gap between conventional banking and the blockchain world. If STOs, the tokenization of assets, and equity trading continue to increase, then Zilliqa’s public blockchain would be the ideal candidate due to its strategic positioning, partnerships, regulatory compliance, and the technology being built on top of it.
 
What is also very encouraging is their focus on banking the un(der)banked. They are launching a stablecoin basket, starting with XSGD. As many of you know, stablecoins are currently mostly used for trading. However, Zilliqa is actively trying to broaden the use cases of stablecoins. I recommend everybody read this text that Amrit Kumar (one of the co-founders) wrote. These stablecoins will be integrated into the traditional markets and bridge the gap between the crypto world and the traditional world. This could potentially revolutionize and legitimise the crypto space if retailers and companies start to use stablecoins for payments or remittances, for example, instead of them solely being used for trading.
 
Zilliqa also released their DeFi strategic roadmap (dating from November 2019), which seems to align well with their OpFi strategy. A non-custodial DEX built by Switcheo is coming to Zilliqa, which allows cross-chain trading (atomic swaps) between ETH, EOS, and ZIL based tokens. They also signed a Memorandum of Understanding for a (soon to be announced) USD stablecoin. And as Zilliqa is all about regulations and being compliant, I’m speculating that it will be a regulated USD stablecoin. Furthermore, XSGD is already created and visible on the block explorer, and XIDR (an Indonesian stablecoin) is also coming soon via StraitsX. Here is also an overview of the Tech Stack for Financial Applications from September 2019. Further quoting Amrit Kumar on this:
 
“There are two basic building blocks in DeFi/OpFi though: 1) stablecoins, as you need a non-volatile currency to get access to this market, and 2) a DEX to be able to trade all these financial assets. The rest are built on top of these blocks.
 
So far, together with our partners and community, we have worked on developing these building blocks with XSGD as a stablecoin. We are working on bringing a USD-backed stablecoin as well. We will soon have a decentralised exchange developed by Switcheo. And with HGX going live, we are also venturing into the tokenization space. More to come in the future.”
 
Additionally, they also have the ZILHive initiative that injects capital into projects. There have already been 6 waves of various teams working on infrastructure, innovation, and research, and they are not from ASEAN or Singapore only but global: see the grantees breakdown by country. Over 60 project teams from over 20 countries have contributed to Zilliqa's ecosystem. This includes individuals and teams developing wallets, explorers, developer toolkits, smart contract testing frameworks, dapps, etc. As some of you may know, Unstoppable Domains (UD) blew up when they launched on Zilliqa. UD aims to replace cryptocurrency addresses with human-readable names and allows for uncensorable websites. Zilliqa will probably be the only one able to handle all these transactions on-chain due to its ability to scale and the resulting low fees, which is why the UD team launched on Zilliqa in the first place. Furthermore, Zilliqa also has a strong emphasis on security, compliance, and privacy, which is why they partnered with companies like Elliptic, ChainSecurity (part of PwC Switzerland), and Incognito. Their sister company Aqilliz (Zilliqa spelled backwards) focuses on revolutionizing the digital advertising space and is doing interesting things like using Zilliqa to track outdoor digital ads with companies like Foodpanda.
 
Zilliqa is listed on nearly all major exchanges, has several different fiat gateways, and has recently been added to Binance’s margin trading and futures trading with really good volume. They also have a very impressive team with good credentials and experience. They don't just have “tech people”: they have a mix of tech people, business people, marketeers, scientists, and more. Naturally, it's good to have a mix of people with different skill sets if you work in the crypto space.
 
Marketing & Community
 
Zilliqa has a very strong community. If you follow their Twitter, their engagement is much higher than you would expect for a coin with approximately 80k followers. They have also been ‘coin of the day’ on LunarCrush many times. LunarCrush tracks real-time cryptocurrency value and social data. According to their data, it seems Zilliqa has a more fundamental and deeper understanding of marketing and community engagement than almost all other coins. While almost all coins have been a bit frozen in the last months, Zilliqa seems to be on its own bull run. It was ranked somewhere in the 100s a few months ago and is currently ranked #46 on CoinGecko. Their official Telegram also has over 20k people and is very active, and their community channel, which is over 7k now, is more active and larger than many other official channels. Their local communities also seem to be growing.
 
Moreover, their community started ‘Zillacracy’ together with the Zilliqa core team (see www.zillacracy.com). It’s a community-run initiative where people from all over the world are now helping with marketing and development on Zilliqa. Since its launch in February 2020 they have been doing a lot, and they will also run their own non-custodial seed node for staking. This seed node will allow them to start generating revenue and become a self-sustaining entity that could potentially scale up into a decentralized company working in parallel with the Zilliqa core team. Comparing it to all the other smart contract platforms (e.g. Cardano, EOS, Tezos etc.), they don't seem to have started a similar initiative (correct me if I’m wrong though). This suggests, in my opinion, that these other smart contract platforms do not fully understand how to utilize the ‘power of the community’. This is something you cannot ‘buy with money’, and it gives many projects in the space a disadvantage.
 
Zilliqa also released two social products called SocialPay and Zeeves. SocialPay allows users to earn ZILs while tweeting with a specific hashtag. They recently used it in partnership with the Singapore Red Cross for a marketing campaign after their initial pilot program. It seems like a very valuable social product with a good use case. I can see a lot of traditional companies entering the space through this product, which they seem to suggest will happen. Tokenizing hashtags with smart contracts to get network effects is a very smart and innovative idea.
 
Regarding Zeeves: this is a tipping bot for Telegram. They already have thousands of signups and they plan to keep upgrading it so more and more people use it (e.g. they recently added a quiz feature). They also use it during AMAs to reward people in real time. It’s a very smart approach to grow their communities and get people familiar with ZIL. I can see this becoming very big on Telegram. This tool suggests, again, that the Zilliqa team has a deeper understanding of what the crypto space and community need and is good at finding the right innovative tools to grow and scale.
 
To be honest, I haven’t covered everything (I'm also reaching the character limit, haha). So many updates have been happening lately that it's hard to keep up, such as the International Monetary Fund mentioning Zilliqa in their report, custodial and non-custodial staking, Binance margin and futures, the widget, entering the Indian market, and more. The Head of Marketing, Colin Miles, has also released this as an overview of what is coming next. And last but not least, Vitalik Buterin has mentioned Zilliqa lately, acknowledging that both projects have a lot of room to grow. There is much more info of course, and a good part of it has been served to you on a silver platter. I invite you to continue researching by yourself :-) And if you have any comments or questions please post here!
submitted by haveyouheardaboutit to CryptoCurrency [link] [comments]

Coil Whine - Unique Situation and What I've Learned and my Desperate Need for Help.

Specs listed at the bottom before you pull your hair out and throw your chair out the window.

So for the past 2 months I have been digging all over the internet and troubleshooting this problem in every way I can conceive and I have been through quite the journey to get where I am now, only to find that I may literally be the only one suffering from my unique problem. I am going to be somewhat detailed so that anyone else suffering from this might find this post and learn something (if we find a solution).
I will try to keep it concise, but I need you all to know what I have and have not tried so that we don't waste everyone's time.
I have an audio buzz. This buzz comes primarily from the analogue ins/outs on my PC's hardware. USB audio ins/outs have this as well, but not nearly as bad. I have a USB mixer that I thought was the culprit, because it became apparent as I was setting up the audio system for streaming. I initially discovered ground loops and tried to mitigate the problem by eliminating them. No dice. I systematically eliminated every single ground from the system and removed components, to no avail.
It would literally be impossible for me to have a ground loop with my current setup - I really dialed that in before I moved inside the PC. Yes I have even plugged the entire system (AS A TEMPORARY - LITERALLY 30 SECOND TEST) into the outlet with no ground prong (bring on the hate) to eliminate that possibility.
The main problem that I have is due to the fact that, with the way my audio works, I have to monitor ("listen to this device") one input or another. I need on-the-fly control of multiple audio streams at my mixer, so I have audio running from Windows into my mixer and back out at 2 points. If I want to hear anything from one of them I MUST monitor it within Windows. Monitoring the USB audio source does make things significantly quieter than monitoring the analogue line-in, so I am set up this way and things are better than they could be - but still not nearly acceptable.
Spoiler: it is due to coil whine, which apparently is a non-issue for every single other person in the world because they can hide their PC below their desk, keep the culprit component enclosed in the case, or use good headphones, and not have to listen to the "hardware coil whine." Nobody hears their coil whine through their audio output. If they do, they've been searching for solutions to:
These people that are searching this DO get the help they need. They simply disable a culprit unused audio source, disable monitoring ("listen to this device") on an audio source, reduce microphone boost, or lower input/output levels. Some even have success disabling or enabling drivers, but I think this is not the ACTUAL solution: I notice that when I disable, uninstall or update devices/drivers, settings roll back too, and any device I was monitoring is no longer monitored (or is monitored by the wrong audio output). My theory is that drivers have nothing to do with this problem - any apparent fix or genesis of the problem due to Windows Update or drivers is actually just settings being defaulted or changed by the audio system resetting.
I have also tried USB isolation and dedicated sound cards (which just pass the problem along). The problem is exactly the same no matter what because again - this is due to coil whine and it is at the hardware level at its core.
I discovered that it was coil whine after thinking I had discovered it was not coil whine. After all, removing my GPU from the equation didn't stop the sound from persisting in my headphones, and a CPU can't coil whine (I don't think)... Anyways, I happened upon what I thought was a workaround last night. Yeah, sure, the buzz is still there, but I am pretty sure it is not coming into my stream. Wrong. I load up a game (and I have my case side panel off) and before I can get into my headphones to check if the noise is back I notice it coming from inside my PC's case.
Quick throw-on of the headphones and a quick diagnostic tells me that indeed I am hearing the same noise inside the case and through my headphones. As mentioned before - the USB monitoring has lessened the problem, but not eliminated it.
So I have a big "HELL YEAH" moment. The problem is still there - but I know it is SOMEWHERE in this chunk of hardware I am looking at in front of me, and I can assume it is either the PSU, the Motherboard or the GPU.
So I take to doing some testing. In my months of research I found that when the computer is "drawing", as in pixels are generating new information, the problem is worse. I also know that loading my CPU to 100% significantly reduces the noise it is making, and I know these things can be related to changes in voltage at the CPU/GPU.
So I get a game loaded and go to work. Unplug Display Port - nothing changes significantly, but there is a small change nonetheless. But the monitor literally isn't drawing anything. The CPU is still relaying information (mouse position, the Game, etc). So either way the GPU is still receiving information, just not passing it on to the monitor.
Pull the 8 pin off the GPU - Fan cranks to 650% and I couldn't hear anything if I tried. So no dice there but I remember trying this before and not noticing much of a change either.
So now I open Performance Monitor and a web page with plenty of white on it (white seems to generate the most noise) and start scrolling around. I notice that I get spikes on the GPU AND THE CPU when scrolling, and the noise in the headphones and at the hardware level is consistent with the movement and the readouts in Performance Monitor.
I run Cinebench R20 and the CPU shuts the F**k up for the most part, but mostly because it is whining at a high frequency now and most of it is out of normal hearing range. I have a wider hearing range due to ear training, so I can still pick up the low end of it (18-19 kHz), and I think that if only this was all I had to deal with, that would be great.
However, I am still getting quick spikes (during the R20 test) when I move the mouse to highlight different tables in Performance Monitor - so the GPU is also in on it.
Speaking of trying to isolate hardware problems: I have tried isolating the noise in the case using a straw and a notebook to block the sound, and I really can't determine if it is the GPU, the CPU, or some component on the motherboard, or all three. I do know it is not coming from the PSU because that is easy enough to isolate in my case (pun not intended - but enjoyed).
However, just because the PSU does not whine doesn't mean it isn't the culprit - if it is delivering unstable power to a component then it sure could be (correct me if I am wrong).
So here I am, wondering if you all have any valuable input. Please consider that I have read (no exaggeration) 200+ unique pages on this topic (broad as it was in the beginning) and I have tried everything suggested BESIDES replacing the CPU, GPU, MOBO, or PSU.
And that is why I am here asking for your advice. I probably need to replace components and I have to start somewhere - I cannot RMA anything besides the GPU (I lost all proofs of purchase; I paid cash for some items at retailers and lost the paperwork when moving). And MSI will not RMA motherboards for coil whine anyway (according to numerous posts). I am prepared to buy a new MOBO and PSU, but I wonder where you think I should start.
Nvidia is looking into RMA'ing the card for me but they're hesitant.
I just want to list some other random things I have tried with no success so that you don't waste your time having to ask.
Please let me know if you have any input or are suffering the same problem. I would really appreciate it and hopefully someone suffering a problem can find this post and learn something about their own situation from all the processing I have done.

Specs:
Thanks in advance.

Update: In case this gets read by more than 3 people: I changed the MOBO and PSU (independently and together, as separate tests) and nothing has changed.
submitted by oSHTsasQuatch to techsupport [link] [comments]

Update and Few Thoughts, a (Well-Typed) transcript: Liza&Charles the marketeers, Voltaire kick-off, PrisM and Ebb-and-Flow to fuck ETH2.0 Gasper, the (back)log of a man and a falcon, lots of companies, September Goguen time, Basho, 2021 Titans, Basho, Hydra and much more thoughts and prayers

Hi everybody, this is Charles Hoskinson, broadcasting live from warm sunny Colorado. I'm trying a new streaming service that allows me to annotate a few things and simulcast to both Periscope and YouTube. Let's see how this works. I also get to put in a little caption. I think for the future I'm just going to put: "I will never give away ada", so when people repost my videos for giveaway scams they at least have that. First off, a thank you: a community member named Daryl decided to carve a log and give his artistic impression of my Twitter profile picture of me and the falcon, so that always means a lot when I get these gifts from fans. And also, on the back of the Catalyst presentation, I just wanted to express my profound gratitude and excitement to the community.
You know, it's really really cool to see how much progress has been made in such a short period of time. It was only yesterday we were saying "when Shelley?". Now Shelley's out and it's evolving rapidly. Voltaire is now starting to evolve rapidly, and we're real close to Goguen. At the end of this month we'll be able to talk about some of the realities of Goguen and some of the ideas we have, give some dates for certain things, and give you a sense of where that project is at. The good news is that we have gained an enormous amount of progress and knowledge about what we need to do and how to get that done, and basically people are just executing, and it's a much smaller task than getting us to Shelley. With Byron to Shelley we literally had to build a completely new cryptocurrency from the ground up. We had to have new ledger rules and a new update system, we had to invent a way of transitioning from one system to another system, and there are hundreds of other little innovations along the way: a new network stack and so forth. Byron cosmetically looks like Shelley, but under the hood it's completely different, and the Shelley design was built with a lot of the things that we needed for Goguen in mind. For example, we built Shelley with the idea of extended UTXO, and we built Shelley understanding what the realities were for the smart contract model, and that's one of the advantages you get when you do this type of bespoke engineering. There are two consequences to that: one, the integration is significantly easier, and two, the integration is significantly faster. We won't have to deal with that same complexity there.
The product update at the end of the month... we'll really start discussing some of these things, as well as talk about partners and about how the development ecosystem is going to evolve. There are a lot of threads throughout all three organizations happening simultaneously. Emurgo: they're really thinking deeply about DeFi, and they've invited us to collaborate with them on things like stablecoins, for example, but we're also looking at oracles (oracle pools), DEXes and these other things, and because there are already people in the market who have made mistakes and learned lessons, it gives us the benefit of hindsight. It means we can be much faster to market and we can build much more competitive things in market, and the Cardano community gets first access to these next-generation DeFi applications without a lot of the problems of the prior generations, and that's super beneficial to us.
You know, the other side of it is that Voltaire is going to have a systemic influence, not just on community funding but also on the overall evolution and direction of the platform. The longer it exists, the more pervasive it will become. It will probably first be applied towards the Cardano Foundation roadmap, but later on it will definitely have a lot of influence and say over every aspect of the system, including the launch dApps and these other things - basically, long term, the types of problems that Cardano solves. That's incredibly appealing and very exciting to me, because it's like I have this giant community brain with the best and brightest of all of you working with us to get us where we need to go.
You know, another thing that was super encouraging (it's a small thing, but it shows us that we're definitely heading in the right direction) was that we recently got a demo from Pramod (Viswanath) and his team out of the University of Illinois of a protocol they created called PrisM, which is a super fast proof-of-work protocol. They wrote this beautiful paper, and they wrote code along with it that showed that PrisM is ten thousand times faster than Nakamoto consensus. If you take the bitcoin proof-of-work protocol, you strip it out, you put PrisM in, you can run the entire bitcoin system 10000 times faster, and they have these beautiful benchmarks to show that, even in bad network conditions. I'm promoting this team; they're real researchers and real engineers, and they use a lot of cool HPC concepts like springboarding and other things like that to accommodate that. Then I asked him in the presentation: well, how much faster if you replay the Ethereum chain? He says: well, that takes a big performance hit, it could be only maybe a hundred times, because that model is not as easy to optimize and shard with standard computer science concepts. In fact, in some cases there are limitations there that really can't be overcome. It turns out that we're more on that UTXO side than we are on the account side. By coincidence, or by intent of the design of extended UTXO, we're gonna have a lot easier time getting much higher performance where and when it's necessary.
I also approved this week a scaling up of the Basho project, in particular to build a Hydra prototype team. The science has gotten to a point where we can make a really competitive push in that particular direction. What does that mean? It means that in just a few short months we can de-risk technological approaches that long term will give us a lot of fruit where and when the community decides that they need infrastructure like Hydra. Now, here's the beautiful thing about Hydra. If you watch my whiteboard from back in September of 2017, when Cardano first hit the market with Byron, I talked about this concept of looking at scalability with a very simple test: as you get more people in the system, it stays at the same performance or it gets faster. We all experience systems that do this, for example bittorrent: the more people downloading something, the faster you tend to be able to get it. And we all experience the converse, where the system gets slower as you get more people. What does this mean? It means that Hydra is an actual approach towards true scalability in the system, and it's a lot easier to do than sharding, even though we have a beautiful approach to sharding on the ledger side if we truly desire to go down that way. There are beautiful ideas that we are definitely in deep discussions about. That's a very complex thing. There was recently a paper ("Ebb-and-Flow Protocols: A Resolution of the Availability-Finality Dilemma") out of Stanford that showed that the Gasper protocol as proposed for ETH2.0 does have some security concerns, and it's going to be the burden on the shoulders of the Ethereum 2.0 developers and Vitalik to address those concerns from those Stanford professors. Whenever you have these very complex protocols, they have so many different ways they can break and things can go wrong, so it's much more appealing when you don't have to embrace complexity to achieve the same. The elegance of Hydra is that stake pool operators are very natural parties to put Hydra channels on, and every time we add one we get much more performance out of that, and the system gets more valuable. The k factor increases, which means you get more stake pool operators, which means you get more Hydra channels. So with growth we get appreciation, with appreciation we get more decentralization, with more decentralization we get more performance. In essence, spiritually speaking, this is really what we meant when we said scalability: the system will always grow to meet its particular needs, and we have a very elegant way of moving in that direction that doesn't require us to embrace very sophisticated techniques. That's not to say that these techniques don't have a place and purpose, but it means the urgency of implementing them is gone, and we then have the luxury to pick the best science when it's ready, instead of rushing it to market to resolve a crisis of high fees. We'll never have that crisis, so there's a beauty to Cardano that is missing, in my view, from many cryptocurrencies and blockchains in the marketplace, and we're now seeing that beauty shine through. Not only through our community, who are so passionate and amazing, but in the science and the engineering itself and how easy it is for us to navigate the concepts. How easy it is for us to add more things, to take some things away, to clean some things up here and there, and our ability to move through.
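To make that scaling intuition concrete, a minimal sketch with assumed numbers (illustrative only, not measured Cardano figures):

```python
# If each stake pool can host a Hydra head/channel, aggregate capacity
# grows with decentralization instead of degrading (toy model only).
TPS_PER_HEAD = 1_000                 # assumed throughput of one head

def total_tps(num_pools: int) -> int:
    return num_pools * TPS_PER_HEAD

for k in (100, 500, 1000):           # as the k parameter (pool count) grows...
    print(k, total_tps(k))           # ...throughput scales linearly with it
```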
I never imagined, when I signed up in 2015 to go on this crazy ride and try to build a world financial operating system, that we would have made as much progress as we have today. We've written more than 75 research papers as an organization, many of which are directly applicable to Cardano. We've got great partners who work with NASA and Boeing and Pfizer, massive companies that have 10 years of history and millions of users, who have come in to help us grow better. We've worked with incredible organizations, major universities like the University of Wyoming, the University of Edinburgh, Tokyo Tech, professors all across the world. We've worked with incredible engineering firms like VacuumLabs and AtixLabs and Twig and Well-Typed, Runtime Verification, QuviQ and dozens of others over the years, and despite the fact that at times there have been delays and friction throughout this entire journey, we've mostly been aligned and we keep learning and growing. It gives me so much hope that our best days are ahead of us, and an almost fanatical belief that success is inevitable in a certain respect. You see, because we always find a way to be here tomorrow, and we always find a way to make tomorrow a better day than today, and as long as that's the trend - you're monotonically increasing towards a better tomorrow - you're always going to have that outcome, you're always going to be in a position where Cardano shines bright. Towards the end of the month we'll have a lot more to say about the development side, and that'll be a beginning, just like Voltaire is a beginning, and then suddenly you notice the beautiful parallelism of the roadmap. Shelley continues to evolve, partial delegation is coming; in fact, I signed the contract with VacuumLabs to bring that to Ledger (and Trezor). The Daedalus team is hard at work to make that feature available for everyone, as is the Yoroi team.
You see that now with Voltaire, and soon with Goguen, and these are not endpoints; rather, they're just beginnings, and they're never over. We can always make staking better, more diverse, more merit-based, entertain different control models, have better delegation mechanics, have better user experience. The same goes for smart contracts; that's an endless river, and along the way what we've discovered is that it's easy for us to work with great minds and great people. For example, with testing of smart contracts, I would love to diversify that conversation above and beyond what we can come up with, and bring in some firms who have done this for a long time to basically take that path with us shoulder to shoulder and build beautiful frameworks to assist us. For example, Runtime Verification is doing this with the EVM, with a beautiful project called Firefly to replace Truffle. I believe that we can achieve similar ends with Plutus smart contracts.
When you ask yourself what makes a system competitive in the cryptocurrency space, in my view there are four dimensions, and you have to have a good story for all four of them. You need security and correctness. A lot of people don't prioritize that, but when they get it wrong it hurts retail people, it hurts everyday people. Billions of dollars have been lost due to the incompetence and ineptitude of junior developers making very bad mistakes, and oftentimes those developers faced no consequences. The people who lost money were innocent people who believed in cryptocurrencies and wanted to be part of the movement but didn't protect themselves adequately. That's a really sad thing, and it's unethical to continue pushing a model where that is the standard or the likely outcome rather than a rare edge case. You have to, as a platform, a third-generation platform, invest heavily in giving the developers proper tools to ensure security and correctness. We've seen a whole industry grow up there; there have been great innovations out of Quantstamp and ConsenSys and dozens of other firms in the space, including Runtime Verification, who have really made major leaps in the last few years in trying to improve that story. What's unique to Cardano is that we based our foundations on languages that were designed right the first time, and there are over 35 years of history for the approach we're following on the Haskell side, which allows us to build high-assurance systems, and allows the developers in our ecosystem to build high-assurance systems. We didn't reinvent the wheel, we found the best wheel and we're giving it to you.
I think we're going to be dominant in that respect as we enter 2021. Second, you look at things like ease of maintenance, ease of deployment, the life cycle of the software, and upgrades to the software. As we've demonstrated with things like the hard fork combinator, and the fact that Voltaire is not just a governance layer for ada and Cardano but will eventually be reusable for any dApp deployed on our system, you have very natural tooling that's going to allow people to upgrade their smart contracts and their dApps, and to enable governance for their users at an incredibly low cost, without having to reinvent the governance wheel for each and every application. This is another unique property of our system, and it can be reused for the dApps that you deploy, as I've mentioned before. Performance is a significant concern, and this was often corrupted by marketers, especially ICO marketers, who really wanted to differentiate and say: "our protocol, tested on a single server in someone's basement, does 500000 transactions per second", as if that somehow translates to real-life performance. That's antithetical to anyone who's ever studied distributed systems and understands the reality of these systems and where they go and what they do. In terms of performance, I think we have the most logical approach. You know, we have 10 years of history with bitcoin; it's a massive system; we've learned a huge amount; there are a lot of papers written about it and a lot of practical projects; and bitcoin is about to step into the world of smart contracts. We congratulate them on getting Schnorr sigs in and on the success of Taproot. That means entering 2021, 2022, we are going to start seeing legitimate dApps and DeFi projects, real applications, that instead of choosing Ethereum or Algorand, EOS, or Cardano, choose bitcoin, and they're adding a lot to that conversation. I think that ultimately that model has a lot of promise, which is why we built a better one. There are still significant limitations with what bitcoin can accomplish, from settlement time to the verbosity of contracts that can be written.
The extended UTXO model was designed to be the fastest and most charitable accounting model ever, on and off chain, and Hydra was designed to allow you to flex between those two systems seamlessly. When you look at the foundations of where we're at and how we can extend this, from domain-specific languages for domain experts, such as Marlowe for financial experts, and the DSLs that will come later for others, like lawyers and supply-chain experts and medical databases and so forth, and how easy it is to write and deploy these, with Plutus being beautiful glue code for both on- and off-chain communications, I think we have an incredibly competitive offering for performance. And when Hydra comes, simply put, there'll be no one faster. If we need to shard, we're going to do that, and definitely better than anybody else, because we know where our security model sits and there won't be surprise Stanford papers to blindside us that require immediate addressing.
In terms of operating costs - this is the last component, in my view - that's basically how much it costs you, the developer, to run your application. There are really two dimensions: one is predictability and the other is amount. It's not good enough to say: it's a penny per transaction today. You need to know that after you spend millions of dollars and months or years of effort building and deploying something, you're not going to wake up tomorrow and find it's now five dollars to do what used to cost a penny. You need that cost to be as low as possible and as predictable as possible, and again, with the way that we architected our system, as we turn things on towards the end of this year and as we enter next year, we believe we have a great approach to achieve low operating costs. One person asks: why Cardano? Well, because we have great security and correctness in the development experience, and tools with 35 years of legacy that were built right the first time and don't put the burdens of mistakes on your customers. They ask why Cardano, and we say: well, the chain itself is going to give you great solutions for identity, value transformation, and governance itself, and as a consequence, when you talk about upgrading your applications, having a relationship with the customers of your applications, and the ease of maintenance of those applications, there's going to be a good story there, and we have beautiful frameworks like Voltaire that allow that story to evolve, and we keep adding partners who have decades of experience to move us along. We won't stop until it's much better. They ask why Cardano, and we say: because at the moment we're 10 times faster than Ethereum today, and that's all we really need for this year and next year, to be honest, and in the future we can be as fast as we need to be because we're truly scalable. As the system gets more decentralized, the system improves performance, and where and when we need to shard, we can do that. We'll have the luxury of time to do it right, the Cardano way. And when people ask why Cardano, the reality is that it's very cheap to do things on our platform and with the way we're building things, and that's going to continue being the case, and we have the governance mechanisms to allow the community to readjust fees and parameters so that it can continue being affordable for users. Everything in the system will eventually be customizable and parameterizable, from block size to transaction fees, and the community will be in a good position to dynamically adjust these things where and when needed, so that as an ecosystem we can enjoy predictability in our costs.
In the coming weeks and months, especially in my company, we're going to invest a lot of time and effort into comparison marketing and product marketing. When I see people say: oh well, you've launched proof of stake, a lot of other people have done that, I don't think those people fully appreciate the magnitude of what we actually accomplished as an ecosystem and the quality of the protocols that are in distribution. That's not their fault, it's our fault, because we didn't take the time, in simplistic terms - not scientific papers and deep code and formal specifications, but everyday language - to really show why we're different. I admit that's a product failing and it needs to be corrected, so we hired a great marketing director named Liza (Horowitz?), and she is going to work full time with me and others in the ecosystem, a great team of people, every single day, to get out there and explain that what we have done is novel, unique, competitive and special to our industry. Everything from Ouroboros, in contrast to the major other protocols from the EOSes and Algorands and the Tezos of the world - why we're different, the trade-offs we chose over them - to our network stack, to the extended UTXO model, to Plutus, to Marlowe, and we're going to keep hammering away at that until we get it right and everybody acknowledges and sees what has been accomplished.
I've spent five years of my life, good years of my life, and missed a lot, to get this project where it needs to go. All of our employees have invested huge sums of their personal lives, their time, their brand, their careers, in trying to make this the most magical and special cryptocurrency and blockchain infrastructure around. No one ever signed up at this company, or the other companies working on Cardano, to work on a mediocre protocol, just another blockchain; they signed up to change the world, they signed up to build a system that can legitimately look you in the face and say: one day we have the potential to have a billion users! That's what they signed up for, and they showed up to play. They built technology that evolves in that direction with some certainty and great foundations, and we have an obligation to market in a way that can show the world why, succinctly, with clarity. Understandably, this has been a failing in the past, but you know what? You can always be better tomorrow - that monotonically increasing "make it better" - and that's what we're going to do. We recognized it and we're going to invest in it, and with Voltaire, if we can't do it, you the community can do it and we'll work with you. If you can do a better job, the funding will be there to get that done. In addition to this, we think about 2021 and we ask: where does the future take us? I've thought a lot about this, about how we get through the next five years, and as we close out 2020, here's the reality: we're not going to leave as a company until we have smart contracts and multi-asset, Voltaire has evolved to a point where the community can comfortably make decisions about the future of the protocol, and the staking experience has solidified and is stable.
I don't care if this costs me millions or tens of millions of dollars out of my own pocket to make happen. I'm going to do it because that's my commitment to you, the community, and every product update will keep pushing our way there. We'll continue to get more transparent, we'll continue to get more aggressive, and we'll hire more and parallelize more where we can, to deliver that experience so that Cardano gets where it needs to go. Then, when we ask where we go next, the reality is that the science and the engineering of our industry have given us a menu of incredibly unique, attractive and sexy things to pursue. What we're going to do is work with the community, using the very same tools that are turning on today, the Voltaire tools, the cardano.ideascale.com tools, and we're going to propose a consortium, bring the best and brightest together, and give a vision of where we can take the system in another five years. With the benefit of hindsight, massively improved processes, better estimation capabilities, and the fact that we're not starting with two people at IOG: we're starting with 250 people, the best scientific division in our industry, and a legacy of nearly 100 scientific papers by the end of this year. And that's just us; there are dozens of companies that have worked on Cardano throughout its history. It's about time to scale them up too and get client diversity. So come next year, when the protocol has evolved to the point where it's ready for it, we'll have that conversation with you, the community, and it's going to be a beautiful conversation. At the conclusion of it, there will be certainty about how we evolve over the next five years to get ourselves beyond the cryptocurrency space. I'm very tired of the conversations we keep having: are you going to go to (CoinDesk's) Consensus or not? Who's going to be the big winner? What about Libra, what about this particular regulation, this crypto unicorn, this thing?
You know, I've been in this space a long time and I've noticed that people keep saying the same things year after year in the same venues. Yes, the crowd sizes get larger and the amount of value at risk gets larger, but I haven't seen a lot of progress in the places where I feel it is absolutely necessary for this technology to be permanent: the developing world. We need to see economic identity. People often ask what the mission for Cardano is. For us at IOG, you look at economic identity and you draw a roadmap for it, and you work up and down each and every step along the way: from open data, to self-sovereign identity, to financial inclusion. You can keep going down: decentralized lending, decentralized insurance, decentralized banking, each and every step along the way to economic identity. Once you admit that, a blockchain tells you there's a collection of applications and infrastructure that you need to build.
My life's work is to get to a point where we have the technology and the infrastructure to do that, with principles, and so we'll keep evolving Cardano, and the space and the science as a whole, until I can wake up and say: for each box on that road to economic identity, for all people and not just one group, we have a solution. I'm going to put those applications on Cardano, and success for me is not about being king of the crypto hill, having a higher market cap than Bitcoin, or being entrepreneur of the year or CoinDesk's most influential person. That's meaningless noise. Success for me is reflecting back on the things we have accomplished together and recognizing that millions, if not billions, now live in a system where they all matter, they all have a voice, they all have an equal footing, where the Jeff Bezoses of the world have the very same experience as a person born in Rwanda, and we're not done until that's the case. It's a long road, it's a hard road, but you know what? We're making progress. We have great people in Africa, great people in Eastern Europe, great people in Southeast Asia and great partners all along the way; great people in Latin America, great people in South America, great people here in the United States.
When we talk about economic identity, there are millions, if not tens of millions, of Americans who don't have it, and hundreds of thousands of Canadians. In developed Western cultures it's the greatest blind spot of policy, and as we enter a depression as a result of coronavirus, add millions if not tens of millions more onto that list. Generations are being disenfranchised by this legacy system, and we as an ecosystem, we as an entire community, are offering a different way forward. Not hyper-centralization, not social credit, but a way forward where you own your own money, your own identity, your own data; where you're not a victim of surveillance capitalism or civil asset forfeiture, not shut out of society for saying the wrong things. Each and every human being matters, and I'm optimistic enough to believe that when you remind people that they matter, they're going to rise to the occasion. That is the point of my company and of the things we do each and every day: our mission is to give platforms to the world so that those who don't have economic identity can get it and keep it, no one can take it from them, and they can enjoy ever-increasing growth in standard of living, wealth and prosperity.
However you want to measure that, this is my goalpost. I couldn't care less about the cryptocurrency space. It was a great place to start, but the space needs to be reminded why it exists. Bitcoin was given a mandate on the back of the 2008 financial crisis to do something different. It was not given a mandate to become a new settlement layer for central banks, or a new way for the old guard to make more money, for banks to get bigger, and for those in control to preserve their power. The whole point of doing something as crazy as buying a coin that doesn't even exist in real life, that's just a bunch of numbers in the cloud, was so that we as a society could do something different from the way we'd been doing things before. So each and every member of the cryptocurrency space needs to remind everyone else, from time to time, why we're here, where we came from and where we're going.
The beauty of Cardano is that we have already achieved, for the most part, a decentralized brain, and that momentum is pushing harder than ever. More and more scientists are waking up, more and more institutions are waking up, getting us there. We have the code and the right approach, and I think we have a great competitive offering for 2021 as we go and battle the titans, and that's going to be a lot of fun, but we know who we are and where we're going, and we're in the right places. It's so incredibly encouraging to see stake pool operators not just from California or Texas or New York or Canada, but from the places that need it the most. Everybody does matter, and it means a lot, to me and to the people who are there, but it means a lot to all of us to say that we have created an equal platform. It makes the participation of all of us so much more meaningful. We're not just talking to each other, we're talking to the world, and by working together on this platform we're lifting the world up and giving people hope. That's the point. There's a lot more to do; we didn't get everything done. You never do. You aspire, you work hard, you set a moonshot, and sometimes you just get to orbit on the first go, but you know what? When you build the next rocket, you can go to Mars.
Thank you all for being with me, thank you all for being part of this. Today was a damn good day with the announcement of Voltaire. Go to cardano.ideascale.com and participate; the end of September is going to be a good day too. There are a lot of good days to come, and in between, a lot of hard days doing tasks that are sometimes entirely forgettable but always necessary to keep the revolution and the movement going. I cannot wait for 2021. Our best days are ahead of us, because of you. You all take care now.
Source: https://www.youtube.com/watch?v=BFa9zL_Dl_w
Other things mentioned:
https://cardano.ideascale.com/
https://www.atixlabs.com/blockchain
https://www.well-typed.com/
https://www.vacuumlabs.com/
https://medium.com/interdax/what-is-taproot-and-how-will-it-benefit-bitcoin-5c8944eed8da
https://medium.com/interdax/how-will-schnorr-signatures-benefit-bitcoin-b4482cf85d40
https://quantstamp.com/
https://bloxian.com/bloxian-platforms/ (TWIG)
https://runtimeverification.com/firefly/
https://www.trufflesuite.com/
https://experts.illinois.edu/en/publications/prism-deconstructing-the-blockchain-to-approach-physical-limits (the Prism consensus protocol, not our Atala PRISM: https://atalaprism.io/)
Ebb-and-Flow Protocols: A Resolution of the Availability-Finality Dilemma (aka the Gasper paper, covering the protocol behind ETH 2.0's consensus) https://arxiv.org/abs/2009.04987
http://www.quviq.com/products/
https://en.wikipedia.org/wiki/Schnorr_signature
submitted by stake_pool to cardano [link] [comments]

How EpiK Protocol “Saved the Miners” from Filecoin with the E2P Storage Model?

On October 20, Eric Yao, Head of EpiK China, and Leo, Co-Founder & CTO of EpiK, visited the Deep Chain Online Salon and discussed "How EpiK Protocol saved the miners eliminated by Filecoin by launching the E2P storage model." The following is a transcript of the sharing.
Sharing Session
Eric: Hello everyone, I'm Eric. I graduated from the School of Information Science at Tsinghua University. My Master's research was on data storage and big-data computing, and I published a number of papers at top industry conferences.
Since 2013 I have invested in Bitcoin, Ethereum, Ripple, Dogecoin, EOS and other well-known blockchain projects, and have been active in the blockchain community as an early technology-focused investor and industry observer, with two years of hands-on blockchain experience. I am also a blockchain community initiator and technology evangelist.
Leo: Hi, I’m Leo, I’m the CTO of EpiK. Before I got involved in founding EpiK, I spent 3 to 4 years working on blockchain, public chain, wallets, browsers, decentralized exchanges, task distribution platforms, smart contracts, etc., and I’ve made some great products. EpiK is an answer to the question we’ve been asking for years about how blockchain should be landed, and we hope that EpiK is fortunate enough to be an answer for you as well.
Q & A
Deep Chain Finance:
First of all, let me ask Eric, on October 15, Filecoin’s main website launched, which aroused everyone’s attention, but at the same time, the calls for fork within Filecoin never stopped. The EpiK protocol is one of them. What I want to know is, what kind of project is EpiK Protocol? For what reason did you choose to fork in the first place? What are the differences between the forked project and Filecoin itself?
Eric:
First of all, let me answer the first question, what kind of project is EpiK Protocol.
With the Fourth Industrial Revolution already upon us, comprehensive intelligence is one of the core goals of this stage, and the key to comprehensive intelligence is making machines understand what humans know and learn new knowledge based on what they already know. Building knowledge graphs at scale is a key step towards full intelligence.
To solve the many challenges of building large-scale knowledge graphs, the EpiK Protocol was born. EpiK Protocol is a decentralized, hyper-scale knowledge graph that organizes and incentivizes knowledge through decentralized storage technology, decentralized autonomous organizations and generalized economic models. Members of the global community will expand the horizons of artificial intelligence into a smarter future by organizing all areas of human knowledge into a knowledge map that is shared and continuously updated: an eternal knowledge vault for humanity.
And then, for what reason was the fork chosen in the first place?
EpiK’s project founders are all senior blockchain industry practitioners and have been closely following the industry development and application scenarios, among which decentralized storage is a very fresh application scenario.
However, during Filecoin's development, the team found that, due to certain design mechanisms and historical reasons, Filecoin deviated from the project's original intent: an overly harsh penalty mechanism that threatens to weaken security, and computing-power competition that lets large miners monopolize hash power and thus the packaging rights, since they can inflate their power by uploading useless data themselves.
These problems cause the data environment on Filecoin to get worse and worse, leading to a lack of real value in on-chain data, high data redundancy, and difficulty commercializing the project.
After noting the above problems, the project proposes to introduce multiple roles and a decentralized collaboration platform, a DAO, to ensure the high value of on-chain data through a reasonable economic model and incentive mechanism, and to store that high-value data, the knowledge graph, on the blockchain through decentralized storage. This largely solves both the lack of value in on-chain data and the monopoly of large miners' computing power.
Finally, what differences exist between the forked project and Filecoin itself?
Building on the issues above, EpiK's design is very different from Filecoin's. First, EpiK is more focused in terms of business model: it faces a different market and track from the cloud storage market where Filecoin sits, because decentralized storage has no advantage over professional centralized cloud storage in storage cost or user experience.
EpiK focuses on building a decentralized knowledge graph, which reduces data redundancy and safeguards the value of data in the distributed storage chain while preventing the knowledge graph from being tampered with by a few people, thus making the commercialization of the entire project reasonable and feasible.
From the perspective of ecosystem construction, EpiK treats miners more kindly and solves Filecoin's pain points to a large extent. First, it replaces Filecoin's storage collateral and commitment collateral with a one-time collateral.
Miners participating in EpiK Protocol are only required to pledge 1,000 EPK each, and only once before mining, not per sector.
What does 1,000 EPK mean in practice? You only need to participate in pre-mining for about 50 days to earn the tokens used for the pledge. The EPK pre-mining campaign is currently underway, running from early September to December, with a daily release of 50,000 ERC-20 standard EPK. Pre-mining nodes whose applications are approved divide these tokens according to their share of that day's mining, and the tokens can be exchanged 1:1 for mainnet tokens after launch. This will continue to expand the number of miners eligible to participate in EPK mining.
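As a rough sanity check on those numbers, here is a minimal sketch in Python. The 50,000 EPK daily release and the 1,000 EPK pledge are from the interview; the per-node mining ratio is a hypothetical input, since it depends on how many approved nodes share the pool on a given day:

```python
DAILY_RELEASE_EPK = 50_000   # ERC-20 EPK released to pre-miners each day
PLEDGE_EPK = 1_000           # one-time pledge required per miner on mainnet

def days_to_pledge(mining_ratio: float) -> float:
    """Days a node needs to accumulate the pledge, given its assumed
    share of the daily release (e.g. 0.0004 if ~2,500 equal nodes
    split the pool)."""
    daily_income = DAILY_RELEASE_EPK * mining_ratio
    return PLEDGE_EPK / daily_income

# The quoted "about 50 days" implies a daily income of ~20 EPK,
# i.e. a mining ratio of 20 / 50,000 = 0.0004:
print(days_to_pledge(0.0004))  # -> 50.0
```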
Second, EpiK has a more lenient penalty mechanism than Filecoin's official consensus, storage and contract penalties. Because data can only be uploaded by field experts, the "Expert to Person" (E2P) model, every piece of data is backed up by multiple miners, which means that if one or more miners go offline, the network is barely affected. A miner who fails to submit the proof of spacetime in time because he is offline only has the effective computing power of that sector forfeited, not his pledged coins.
If the miner re-submits the proof of spacetime within 28 days, he regains that power.
Unlike Filecoin’s 32GB sectors, EpiK’s encapsulated sectors are smaller, only 8M each, which will solve Filecoin’s sector space wastage problem to a great extent, and all miners have the opportunity to complete the fast encapsulation, which is very friendly to miners with small computing power.
Constraints on data volume and quality also ensure that the gap in effective computing power between large and small miners does not widen.
Finally, unlike Filecoin’s P2P data uploading model, EpiK changes the data uploading and maintenance to E2P uploading, that is, field experts upload and ensure the quality and value of the data on the chain, and at the same time introduce the game relationship between data storage roles and data generation roles through a rational economic model to ensure the stability of the whole system and the continuous high-quality output of the data on the chain.
Deep Chain Finance:
Eric, on the eve of Filecoin’s mainline launch, issues such as Filecoin’s pre-collateral have aroused a lot of controversy among the miners. In your opinion, what kind of impact will Filecoin bring to itself and the whole distributed storage ecosystem after it launches? Do you think that the current confusing FIL prices are reasonable and what should be the normal price of FIL?
Eric:
The Filecoin mainnet has launched and many potential problems have been exposed, such as the aforementioned high pre-collateral, the storage waste and computing-power monopoly caused by unreasonable sector encapsulation, and the harsh penalty mechanism. These problems are serious and will greatly affect the development of the Filecoin ecosystem; here are two examples to illustrate. Consider the monopoly of computing power by big miners: once big miners have monopolized computing power, a very delicate state emerges. When a miner stores a file for an ordinary user, there is no way to verify on-chain whether what he stored was uploaded by himself or by someone else, because he can fake another identity and upload data to himself. So whichever data a miner chooses to store, he has only one goal: to inflate his computing power, as fast as he can.
In terms of computing power there is no difference between storing someone else's data and storing my own, and when I store someone else's data I know nothing about it; the uploader could be anywhere in the world, and the bandwidth between us may not be good enough. The best option is therefore to store my own local data, which makes economic sense, and the result is that nobody stores anyone else's data on the chain at all. Everyone stores their own, because it's cheapest for them, so the network has essentially no storage utility and no one is providing storage for the mass of retail users.
The harsh penalty mechanism will also severely deplete miners' profits, because DDoS attacks are a very common technique: a big miner can earn a very high profit in a short period by attacking rivals, which makes it a profitable move for every big miner. As things stand, the vast majority of miners are not well maintained and are poorly protected even against low-grade DDoS attacks, so the penalty regime is grim for them.
The contradiction between an unreasonable system and real demand will inevitably push the system to evolve in a more reasonable direction, so there will be many forked projects with more reasonable mechanisms, attracting Filecoin miners and diverting storage power.
Since every such project is on the decentralized storage track, their demands on miners are similar or even mutually compatible, so miners will gravitate to the forks with better economics and business scenarios, filtering out the projects with real value on the ground.
As for the chaotic FIL price: FIL is a project that has run for several years and carries too many expectations, so one can only say the current situation has its reasons. There is no way to predict a reasonable price for FIL, because in the long run it depends on whether the project can commercialize and on the actual value of the data on the chain. In other words, we need to keep observing whether Filecoin becomes a game of computing power or a real value carrier.
Deep Chain Finance:
Leo, we just mentioned that Filecoin's pre-collateral caused dissatisfaction among miners, and after the mainnet launch the second-round Space Race test coins were turned directly into real coins while the officials sold FIL into the market, so many miners said they were betrayed. What I want to know is: EpiK's motto is "save the miners eliminated by Filecoin". How does EpiK deal with Filecoin's various problems, and how will it achieve that "saving"?
Leo:
Originally Filecoin’s tacit approval of the computing power makeup behavior was to declare that the official directly chose to abandon the small miners. And this test coin turned real coin also hurt the interests of the loyal big miners in one cut, we do not know why these low-level problems, we can only regret.
EpiK didn’t do it to fork Filecoin, but because EpiK to build a shared knowledge graph ecology, had to integrate decentralized storage in, so the most hardcore Filecoin’s PoRep and PoSt decentralized verification technology was chosen. In order to ensure the quality of knowledge graph data, EpiK only allows community-voted field experts to upload data, so EpiK naturally prevents miners from making up computing power, and there is no reason for the data that has no value to take up such an expensive decentralized storage resource.
With no way to inflate computing power, the difference between big miners and small miners is minimal while the amount of knowledge graph data is still small.
We can’t say that we can save the big miners, but we are definitely the optimal choice for the small miners who are currently in the market to be eliminated by Filecoin.
Deep Chain Finance:
Let me ask Eric: according to the EpiK protocol, EpiK adopts the E2P model, in which only voted-in field experts may upload data. This is very different from Filecoin's P2P model, where individuals upload data as they wish. In your opinion, what are the advantages of the E2P model? And if only voted experts can upload data, does that mean the EpiK protocol is not open to everyone?
Eric:
First, let me explain the advantages of the E2P model over the P2P model.
There are five roles in the DAO ecosystem: miners, coin holders, field experts, bounty hunters and gateways. These five roles share the EPK generated every day once the mainnet launches.
Miners receive 75% of the daily EPK, field experts 9%, and voting users share 1%.
The other 15% of the EPK fluctuates based on the network's daily traffic, and that 15% is in part a game between miners and field experts.
Let me first describe the relationship between these two roles.
The first group of field experts is selected by the Foundation and covers different areas of knowledge (knowledge in a broad sense, including not only serious subjects but also home, food, travel, etc.). This group can recommend the next group of field experts, and a recommended candidate only needs 100,000 EPK of votes to become a field expert.
The field expert’s role is to submit high-quality data to the miner, who is responsible for encapsulating this data into blocks.
Network activity is judged by how much EPK the whole network has pledged for daily traffic (1 EPK buys 10 MB per day); a higher ratio indicates higher demand for data, which requires miners to improve bandwidth quality.
If data demand decreases, field experts are required to provide higher-quality data instead. It's like a library: more visitors means more seats are needed, i.e., the money goes to miners to upgrade bandwidth.
When there are fewer visitors, more money is needed to buy better books to attract them, i.e., the money goes to bounty hunters and field experts to generate more high-quality knowledge graph data. This game between miners and field experts is the most important game in the ecosystem, unlike the Filecoin ecosystem, where the game is between the officials and the big miners.
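Putting the percentages and the traffic game together, here is a minimal sketch of one day's reward split. The fixed shares (75% miners, 9% experts, 1% voters) and the 1 EPK = 10 MB/day rule are from the interview; the daily issuance figure and the linear rule for dividing the floating 15% by utilization are assumptions for illustration only:

```python
DAILY_ISSUANCE_EPK = 100_000  # hypothetical daily issuance, not from the source

def daily_split(traffic_mb: float, pledged_epk: float) -> dict:
    """1 pledged EPK buys 10 MB/day of traffic, so utilization is
    traffic_mb / (pledged_epk * 10). High utilization shifts the
    floating 15% to miners (upgrade bandwidth); low utilization
    shifts it to experts (produce better data)."""
    utilization = min(traffic_mb / (pledged_epk * 10), 1.0)
    floating = 0.15 * DAILY_ISSUANCE_EPK
    return {
        "miners":  0.75 * DAILY_ISSUANCE_EPK + floating * utilization,
        "experts": 0.09 * DAILY_ISSUANCE_EPK + floating * (1 - utilization),
        "voters":  0.01 * DAILY_ISSUANCE_EPK,
    }

# 600,000 MB of traffic against 80,000 pledged EPK -> 75% utilization:
print(daily_split(traffic_mb=600_000, pledged_epk=80_000))
# -> {'miners': 86250.0, 'experts': 12750.0, 'voters': 1000.0}
```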
This game between data producers and data storers, combined with a more rational economic model, inevitably means the E2P model produces stored on-chain data of much higher quality than the P2P model, and the bandwidth quality for data access will also be better, resulting in greater business value and better landing scenarios.
I'll then answer whether this means the EpiK protocol is not accessible to everyone.
The E2P model only constrains the quality of the data generated and stored, not who can play a role in the ecosystem. On the contrary, with the introduction of the DAO model, the variety of roles in the EpiK ecosystem is not limited and includes roles open to ordinary people (anyone competent can take on bounty-hunter tasks), giving everyone a logical way to participate in the system.
For example, a miner with computing power can provide storage, a person with knowledge of a certain domain can apply to become an expert (in history, technology, travel, comics, food, etc.), and a person willing to label and correct data can become a bounty hunter.
Efficient support tools from the project team will lower the barriers to entry for these roles, allowing different people to do their part in the system and contribute together to the ongoing generation of a high-quality decentralized knowledge graph.
Deep Chain Finance:
Leo, some time ago EpiK released a technical white paper and an economic white paper, explaining the EpiK concept from the perspectives of technology and economic model respectively. What I'd like to ask is: what are the shortcomings of current distributed storage projects, and how does the EpiK protocol improve on them?
Leo:
Distributed storage can easily be confused with systems like Alibaba's OceanDB, but in the blockchain field we should focus on decentralized storage first.
There is a big problem with the decentralized storage offerings on the market now, which can be summed up as "why don't they eat meat porridge?" (the Chinese equivalent of "let them eat cake"): the offering is out of touch with what users actually need.
How to understand this? By its technical principles, decentralized storage is not cheaper than centralized storage, and if it ever looks that way, the centralized storage being compared against must be rubbish.
So what incentive does the average user have to spend more money storing data on decentralized storage?
Is it safer?
Hardly: miners on decentralized storage can shut down at any time, which is by no means as safe as keeping a copy each with Alibaba and Amazon.
More private?
There’s no difference between encrypted presence on decentralized storage and encrypted presence on Amazon.
Faster?
The patchwork of ten thousand home-grade connections in decentralized storage simply doesn't compare to the fiber in a centralized server room. This is the root problem of the business model: no one is using it and no one is buying it, so what is the big vision built on?
The goal of EpiK is to guide all community participants in jointly building and sharing domain knowledge graph data, which is the best way for machines to understand human knowledge. The more knowledge graph data there is, the more a robot knows, and its intelligence grows exponentially; in other words, EpiK uses decentralized storage technology to capture the value of exponentially growing data with linearly growing hardware costs, and that's where the buy-in for EPK comes from.
Organized data is worth far more than organized hard drives, and there will be demand for EPK as soon as robots need intelligence.
Deep Chain Finance:
Let me ask Leo: roughly how many forked projects does Filecoin have so far? Do you think the waves of forks will increase or decrease after the mainnet launch? Have miners' requirements for participation changed?
Leo:
We don’t have specific statistics, now that the main network launches, we feel that forking projects will increase, there are so many restricted miners in the market that they need to be organized efficiently.
However, most forked projects we've seen so far simply tweak the parameters of Filecoin's economic model, which is undesirable. That level of modification can't change the status quo of miners inflating computing power; all it does is make some of the big miners more comfortable mining, which won't help decentralized storage land.
We need more reasonable landing scenarios, so that idle mining resources can be turned into effective productivity, building a 100x coin instead of committing to one wave of FOMO sentiment after another.
Deep Chain Finance:
How far along is the EpiK Protocol project, Eric? What big moves are coming in the near future?
Eric:
The development of the EpiK Protocol is divided into five major phases:
Phase I: the "Obelisk" test network.
Phase II: mainnet 1.0, "Rosetta".
Phase III: mainnet 2.0, "Hammurabi".
Phase IV: enriching the knowledge graph toolkit.
Phase V: enriching the knowledge graph application ecosystem.
We are currently in Phase I, the "Obelisk" test network. Anyone can sign up to participate in the testnet pre-mining to obtain ERC-20 EPK tokens, exchangeable one-for-one after the mainnet launch.
We recently listed ERC-20 EPK on Uniswap; you can trade it freely there, or download our EpiK mobile wallet.
In addition, we will soon launch the EpiK Bounty platform, and we welcome all community members to do tasks together to build the EpiK community. At the same time, we are pushing forward with listings on centralized exchanges.
Users’ Questions
User 1:
Some KOLs say Filecoin has already consumed its value for the next few years, so it will plunge. What do you think?
Eric:
First of all, any judgment of the market has to correspond to the cycle. If you're bearish on FIL, the first judgment to make is whether you're bearish on the project's economic model or bearish on the distributed storage track itself.
We are very confident in the distributed storage track; it will certainly go through cycles of growth and decline, which help us choose the better projects.
Since the existing pool of miners and the computing power already produced are fixed, and since EpiK miners and FIL miners are compatible, miners will at any time choose the more promising and more economically viable projects.
As for the claim that "Filecoin has consumed the value of the next few years, so it will plunge": a plunge is not something I would predict. In this industry you have to keep learning, iterating and judging value; market sentiment going up and down is one factor, but there are more important ones, such as the big washout in March this year. So one can only say it may slow down the development of the FIL community, but prices are indeed unpredictable.
User 2:
Actually, in the end, if there are no applications and no one really uploads data, the market value will drop. So what are EpiK's landing applications?
Leo: The best and most direct application of EpiK's knowledge graph is question-answering systems: an intelligent legal advisor, an intelligent medical advisor, an intelligent chef, an intelligent tour guide, an intelligent game-strategy assistant, and so on.
submitted by EpiK-Protocol to u/EpiK-Protocol [link] [comments]
