
Ask HN: What under-the-radar technology are you excited about?

Hacker News - Mon Apr 12 15:55


Wikidata, SPARQL, and RDF in general. And I guess semi-relatedly things like Prolog? I recently decided to fiddle with Wikidata, and it is fascinating to be able to query for any part of a statement, not just the value!

In SPARQL you write statements in the form

    <thing> <relation> <thing>
But the cool part is that any of those three parts can be extracted, so you can ask things like "what are the cities in <country>", or "what country harbors the city <city>", but most importantly, "how does <city> relate to <country>".

For example, if you wanted to find out all the historical monuments in state capitals of a country (using my home country as an example, also pseudocode for your time's sake):

    fetch ?monumentName, ?cityName given
    ?monument "is called" ?monumentName.
    ?monument "is located within" ?city.
    ?city "is capital of" ?state.
    ?city "is called" ?cityName.
    ?city "is located within" "Brazil".


I've been considering writing a graph database on top of SPARQL and RDF. Beyond the official docs (which are pretty good), can you recommend any other resources for easily getting the hang of SPARQL?

I'm really excited, to the point of distraction, by the RISC-V ISA.

Some people say that there's nothing new in it, but to my mind, they're missing the point: the Berkeley Four took what couldn't be appropriated for profit, and built a statement about what computing is... They revealed the Wizard of Oz to everyone, so that anyone with some computing background can build a processor, freely.

And now this freed wizard is working his magic, and will change the computing landscape irrevocably.

> "...so that anyone with some computing background can build a processor, freely."

They could already do that. I designed and laid out in silicon a 32-bit processor as part of my undergraduate studies in computer engineering.

You do realize that RISC-V processors aren't free, right? Making an open-source ISA is no small feat, but all they really did was help save some megacorporations a little extra money.

Perhaps it will lead to a processor startup, but follow that to its logical conclusion: it takes a huge, profitable company to sustain processor delivery for years. There's a very good reason why only a handful of companies make the top 6 CPU architectures. There are still Synopsys ARC, Renesas, Atmel, and PIC, to name just a few of RISC-V's competitors.

In reality, the Berkeley Four just made a handful of semi companies richer. WDC, NXP, NVIDIA, Microchip, etc. don't have to pay Arm for IP if they use RISC-V. Did that really help anything? Meh.

Sure, RISC-V designs can be open or not... And of course there's always the cost of fabbing.

There are already designs freely available to use, though, either as they are or to build upon.

And there are also now many other companies designing with the ISA, decentralising the production of chips.

But - over and above the revolutionary economics of it - it's being recognised as a good ISA, and RISC-V cores are already being incorporated into consumer electronics.

You could argue that the availability of Linux saved countless megacorporations from having to pay Microsoft, IBM, or Sun. Yet the availability of Linux has been a boon to people across the board.

While I agree there is something not right about cutting into ARM’s profits for the benefit of megacorporations, I think that a royalty-free ISA might genuinely be good for civilization despite that in the same way Linux is. It’s tough though, I’m still not fully sold on that opinion.


Homomorphic encryption, which enables you to process data without decrypting it. It would solve privacy and data-security issues around sending data to the cloud to be processed.
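
To make the idea concrete, here's a toy Python sketch (my own illustration, not any production scheme): textbook unpadded RSA happens to be multiplicatively homomorphic, so a server can multiply two ciphertexts and the key holder decrypts the product of the plaintexts.

    # Toy illustration with textbook (unpadded) RSA, which happens to be
    # multiplicatively homomorphic: E(a) * E(b) mod n decrypts to a * b.
    # NOT secure and NOT a real FHE scheme (those, e.g. BGV/CKKS/TFHE, support
    # additions and multiplications), but it shows the shape of the idea:
    # the server combines ciphertexts without ever seeing the plaintexts.

    n, e, d = 3233, 17, 2753      # tiny textbook RSA key pair (p=61, q=53)

    def encrypt(m: int) -> int:
        return pow(m, e, n)

    def decrypt(c: int) -> int:
        return pow(c, d, n)

    a, b = 7, 6
    c = (encrypt(a) * encrypt(b)) % n   # computed using ciphertexts only
    assert decrypt(c) == a * b          # the key holder recovers 42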

The extra cost is worrying. You're talking about a 4 to 6 order-of-magnitude increase in resource usage for the same computation.

Unless we figure out some awesome hardware acceleration for it, it's not practical except for a few niche applications.

It also has the problem that you can use computation results to derive the data, if you have enough control over the computation (e.g. a reporting application that allows aggregate reports).


Modern homomorphic encryption schemes only have a ~100x performance hit. That sounds bad, until you remember that computers are really fast and spend the majority of time doing nothing, and that the difference between writing a program in Python or C is already something like 10x.

Zero knowledge proofs for the win! This is one of the things I need to see in a cryptocurrency before I believe it will succeed at scale.

1. Zero-knowledge proofs,
2. shielded ledgers,
3. democratized and energy-efficient mining,
4. inflationary control, and
5. wallet recovery.

No one has all of these yet, but ZKP is a big part of it.
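
For anyone who hasn't seen one up close, here's a toy Schnorr-style interactive proof sketched in Python (my own illustration, not what any particular chain uses; shielded ledgers rely on non-interactive zk-SNARK constructions over much larger, carefully chosen groups). The prover convinces the verifier it knows x such that y = g^x mod p, without revealing x:

    # Toy Schnorr-style interactive proof of knowledge of a discrete log.
    # Illustrative only: tiny parameters, a single round, and no Fiat-Shamir.
    import secrets

    p = 2**127 - 1                  # a Mersenne prime (toy-sized modulus)
    g = 3
    q = p - 1                       # exponent arithmetic is done mod p-1

    x = secrets.randbelow(q)        # prover's secret
    y = pow(g, x, p)                # public value; we prove knowledge of x

    r = secrets.randbelow(q)        # prover: random nonce
    t = pow(g, r, p)                # prover -> verifier: commitment
    c = secrets.randbelow(q)        # verifier -> prover: random challenge
    s = (r + c * x) % q             # prover -> verifier: response

    # Verifier checks g^s == t * y^c (mod p) and learns nothing about x.
    assert pow(g, s, p) == (t * pow(y, c, p)) % p
    print("proof accepted")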

Former Paddle customer here; don't believe the hype. Everything about the API is duct tape and half-assed, the checkout process is happy path or bust, and you're going to have a hard time migrating to a different processor.

Stripe + TaxJar was cheaper and easier to implement and maintain.


I'm a customer, and it handles a lot of headaches for accounting. Basically you have one client, Paddle, as far as the law is concerned. Plus it has an easy inline checkout feature for collecting payment info that never touches your server, so there are no headaches securing that info.


Can you share an approximate percentage they take from a transaction for offering that service? Their pricing information seems to be hidden behind an email form.


Around the internet, I saw a 5% transaction fee. But, like Plaid, they probably customize the fee based on the product sold and volume.


Dependent types and homotopy type theory for general-purpose programming. I am definitely excited about it, but not really sure if those hopes will ever materialize as a useful technology (even in a limited scope).


Same here. IPFS seems to have been lumped into the whole "crypto" scene because of filecoin. In my opinion, IPFS is much more interesting and potentially beneficial to individual freedom than crypto. The problem is the only people really talking about it are people focused on crypto, also forcing dapp conversations to focus on quick money vs. long term usability/functionality. I don't hold the money conversations against anyone...just not interested in that side of it. More interested in the preservation of knowledge long term and how/if that gets built. IPFS seems to be an interesting potential step towards those goals.


There's still a bunch of questions around Filecoin, but if you set aside the token and speculative nature, it's the first realistic proof-of-storage system and represents (1) a departure from proof of work, which is sucking up the power output of several countries for no good reason, (2) a useful base for IPFS to store its bits persistently, and (3) perhaps funding for Protocol Labs so they can continue to advance the state of the art.

This new crypto project just launched its mainnet a few weeks ago: https://chia.net

It's a novel "Proof of" algorithm (Proof of Space and Time) that front loads the resource needs into a Plotting phase, with a very efficient Farming phase after that to process blocks with transactions. Seems like a much more fair, sustainable model for having a secure digital currency.

It also has an interesting, Lisp-based programming language on it.

But what excites me is that it's led by Bram Cohen, the dude who invented BitTorrent, one of the best pieces of tech I've used nearly my whole tech life.


The people who have the most hard drive space will mine the most Chia. How is that any different from just paying with USD to get more, in terms of it being a "much more" fair model?

Not just the most hard drive space, but the very specific fast kinds of hard drives you need to farm it efficiently.

The idea that you can do a sustainable cryptocurrency that remains sustainable no matter how valuable the tokens become in real money terms is self-evidently ridiculous. There's always some limiting resource you'll hit first, and if the cryptotokens are worth real money, that resource will get scarce.

But Chia is a good example of a brilliant person being so seduced by a challenging technical problem that they lose any ability to see foundational problems that people with a tenth of the brampower would be able to spot instantly.

I think the big part is that farming with hard drives is a much more approachable thing to do than trying to mine BTC/ETH, considering both of those at a minimum require crazy GPU hardware, if not full-blown custom chips, that no normal person will buy. They also have insane energy usage, which Chia doesn't really have.

That being said, yes, with the right amount of investment, someone could try to take over the network. Then again, look how many full nodes are already in the network...

https://www.chiaexplorer.com/charts/nodes

Absorbent panels for aircraft and angular design that prevents reflection of emitted waves.

Oh wait, that's under-the-radar technology.

The GraphBLAS[1] is an amazing API and mathematical framework for building parallel graph algorithms using sparse matrix multiplication in the language of Linear Algebra.

There are Python and MATLAB bindings; in fact, MATLAB R2021a now uses SuiteSparse:GraphBLAS for built-in sparse matrix multiplication (A*B).

[1] https://graphblas.github.io/
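
To give a flavour of the "graph algorithms in the language of linear algebra" idea, here's a small Python sketch using plain scipy.sparse rather than a GraphBLAS binding (the graph is made up, and the GraphBLAS proper generalizes this with arbitrary semirings and masks): a breadth-first search where each step is one sparse matrix-vector product.

    # BFS as repeated sparse matrix-vector products (the GraphBLAS idea,
    # sketched with scipy.sparse instead of an actual GraphBLAS binding).
    import numpy as np
    from scipy.sparse import csr_matrix

    # Tiny directed graph: 0->1, 0->2, 1->3, 2->3
    rows, cols = [0, 0, 1, 2], [1, 2, 3, 3]
    A = csr_matrix((np.ones(4, dtype=np.int8), (rows, cols)), shape=(4, 4))

    frontier = np.array([1, 0, 0, 0], dtype=np.int8)   # start BFS at vertex 0
    visited = frontier.astype(bool)
    levels, level = {0: 0}, 0

    while frontier.any():
        level += 1
        reached = (A.T @ frontier) > 0          # one BFS step = one SpMV
        new = reached & ~visited                # mask out visited vertices
        visited |= new
        levels.update({int(v): level for v in np.flatnonzero(new)})
        frontier = new.astype(np.int8)

    print(levels)   # {0: 0, 1: 1, 2: 1, 3: 2}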

The GET (Guaranteed Entrance Token) Protocol https://get-protocol.io/

The protocol offers blockchain-based smart ticketing which eliminates fraud and prevents scalping. This has the potential to get huge when events start coming back post-COVID.


Developments in the AR/VR space. The haptics stuff coming out is really cool and may be consumer-friendly in the near future. Meanwhile, headset tech is seeing major investment, and light field technology is going to do amazing things for this space.


I have a Valve Index. It is the most mind-blowing thing to me. I'm really excited for AR to get going. I've only seen it for shopping (see what item x would look like in your house). I think it might be cool for collaboration.


what do you like about it? (disclaimer, i work at temporal, but just genuinely interested in how you'd describe it in your words; good or bad, let's hear it)


I would say that it allows one to write stateful, long-lasting workflows or processes in a durable and persistent way using all the niceties of regular programming languages such as Go/Java, like control structures, abstraction, etc. In a similar way that generators in Python/JS allow one to write even complicated iterators much more easily than manually keeping track of state yourself, Temporal allows one to easily define long-lived stateful processes (on the order of years, even) with the simplicity of roughly:

    while (true) {
        // Temporal's durable timer: the workflow can sleep for roughly a
        // month and survive worker restarts and redeploys in the meantime.
        Workflow.sleep(Duration.ofDays(30));
        sendReminderEmail(user);
    }
...which would normally require one to manually keep track of state in queues and key-value stores and handle idempotent updates, but with Temporal the developer can just focus on the simple business rules. The Temporal runtime takes care of compiling that down into proper state-machine logic.


I may be able to answer your question. I only watched the video on the landing page and glanced at the docs, but it reminds me of what a saga system would do (like redux-saga, perhaps), meaning that many of the side effects, such as networking, are abstracted away from the business logic, and there is the concept of compensation when things don't go as planned. Very neat!

yeah, "saga for the backend" is an appealing angle to some folks (eg our Coinbase testimonial has that) but i'm not sure how many developers know about sagas so I've been hesitant to push that too hard (bc then we'd have to teach people an extra concept before they get to us).

i'd say something that is maybe hard to appreciate until you really get into it is just how much goes into making this ultra scalable with distributed transactions. If you have ~20 mins for it, I wrote this up recently: https://docs.temporal.io/blog/workflow-engine-principles

Advances in computational chemistry and computational materials science, especially using ML to speed up computation. Computational chemistry is already a patchwork of different heuristic approaches; graph neural networks and even some language models seem like a very promising addition.

If we could simulate and observe what happens with complex chemistry accurately, it would completely change biology, medicine, and materials science.

Yes! This would be fantastic, I'd love to see this happen.

If anyone is working on this and is looking to hire a computational organic chemist turned AI engineer, let me know!

>If we could simulate and observe what happens with complex chemistry accurately

There is the rub. Biological simulations have been written for 40 years now, and it's an extremely difficult problem considering how many latent variables are at play.

I suspect that the recent breakthrough in protein structure prediction is just a start.

ML techniques will be used to cut down that latent-variable space in both quantum chemistry and molecular-mechanics-based methods.

I recently came across Polymarket[1] and am really excited about what crypto has in store for prediction markets and insurance. I've dabbled in prediction markets over the years. I think they have a lot of promise but execution is always an issue. Using ETH to create a two-sided marketplace feels much better than having prediction markets decide what one can bet on.

More broadly, decentralizing insurance in this way would be very cool too... There's little difference, in my mind, between a prediction market predicting weather changes or elections, and insurance contracts around risk.

... and what's even cooler is: can we build bots and models to actually get an edge on these predictions? Imagine applying HFT strategies from stocks to predicting real-world events... Now it sounds like we can actually get good at forecasting difficult-to-predict human events, rather than just stock prices.

[1] https://polymarket.com/

Fees are borderline insane if you are a small volume trader.

If you’re in the US there is a regulated prediction market set to launch soon.

When I played with Augur, the community was very quiet and not a lot was going on. That was a big challenge for me. I also didn't like that you had to use their crypto to actually participate -- in Polymarket's case, I understand that they are using ETH smart contracts.

There are definitely a few players in this space and I'm excited to see where it goes.

Data Hubs.

It's queryable like a database, but it doesn't store your data - it proxies all your other databases and data lakes and stuff, and lets you join between them.

Trino is a great example.

I'm not too deep into the data side of things, but this is interesting to think about.

Aren't things like data lakes and warehouses supposed to address the need for a centralized datastore?

Outside of perhaps an easy-to-apply interface, what benefit would a data hub provide over just streaming duplicates from all of your databases into a single data lake like Snowflake?

Want to cross-reference the ERP database with that stuff team x has in a lake and join it with what team z has in a DWH? You don't need pipelines and jobs and working out where to store that data... you just need a hub! And you can query it ad hoc, too.

We used to have to copy and shuffle data into centralized systems so we could query and join it.

Data hubs do away with all that. Stop needing to think about storage. Instead, it’s just about access.

There have always been fringe tools - e.g. I once did some cool stuff with the MySQL Spider engine - but modern systems like Trino (formerly called Presto) make it easy. And, I predict, they will hit the mainstream soon.
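
To make that concrete, here's a hypothetical sketch assuming the trino Python client (DB-API style); the catalog/schema/table names are made up. The interesting part is the FROM clause: each table lives in a different backing system, and the hub does the join, with no pipeline copying data anywhere first.

    # Hypothetical ad-hoc federated query. Assumes the `trino` Python client
    # and made-up catalog/schema/table names; adjust to your own deployment.
    import trino

    conn = trino.dbapi.connect(host="trino.example.internal", port=8080,
                               user="analyst")
    cur = conn.cursor()
    cur.execute("""
        SELECT o.order_id, o.total, c.clicked_at
        FROM mysql.erp.orders AS o            -- the ERP system's database
        JOIN hive.team_x.click_events AS c    -- team x's data lake
          ON o.customer_id = c.customer_id
        WHERE o.total > 1000
    """)
    for row in cur.fetchall():
        print(row)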

> Stop needing to think about storage. Instead, it’s just about access.

100% resonates when you put it that way. Thanks for the explanation!


I don't think these are really under the radar. We have Hue. We also have other apps that act as data hubs, but are slightly more constrained.

I'm not sure this is exactly under the radar, but I'm excited about wearable AR. No, it doesn't look real, but once you get past that, we have some amazing tech on the horizon.

Microsoft and Oculus have hands-free controls that actually work. Inside-out tracking is progressing quickly. New UX patterns are getting forged.

I'm excited to see what we'll have in a few years' time. In my mind it's far more exciting than something like crypto, but it gets much less press.

> DNA digital data storage is the process of encoding and decoding binary data to and from synthesized strands of DNA.

If each "bit" of DNA can be either A, C, G, or T, why call that binary?


Any grid energy storage technology. Whether it's small home units (smart batteries, vehicle to grid, hydrogen packs...) or huge grid scale (liquid air, gravity, liquid metal batteries...)

Uncertainty quantification and out-of-distribution (OOD) detection in machine learning. It's on some people's radar, but it has the potential to get ML adopted much more widely as people understand what it is actually really good at, and stop giving it things to do that it's bad at.

For a great recent example that gets at some of this, see "Does Your Dermatology Classifier Know What It Doesn't Know? Detecting the Long-Tail of Unseen Conditions" - https://arxiv.org/abs/2104.03829

I'm not affiliated with this work but I am building a company in this area (because I'm excited). Company is in my profile.
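
As a very rough illustration of the kind of thing meant here (nothing like what the linked paper does, and with made-up logits), a classic baseline is to treat the maximum softmax probability as a confidence score and abstain below a threshold:

    # Minimal sketch of a simple OOD-detection baseline: max softmax
    # probability as a confidence score, abstaining below a threshold.
    # The logits are made up; in practice they'd come from a trained model.
    import numpy as np

    def softmax(z):
        z = z - z.max(axis=-1, keepdims=True)
        e = np.exp(z)
        return e / e.sum(axis=-1, keepdims=True)

    def predict_or_abstain(logits, threshold=0.8):
        probs = softmax(np.asarray(logits, dtype=float))
        conf = probs.max(axis=-1)
        preds = probs.argmax(axis=-1)
        return [(int(p) if c >= threshold else "abstain", round(float(c), 2))
                for p, c in zip(preds, conf)]

    logits = [[4.0, 0.5, 0.1],   # peaked: looks in-distribution
              [1.1, 1.0, 0.9]]   # flat: model is unsure, likely OOD
    print(predict_or_abstain(logits))   # [(0, 0.95), ('abstain', 0.37)]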


Narrowband IoT (NB-IoT) data service. Very low power, low bandwidth, cheap, with increasingly great coverage. It's like an offshoot of LTE, but classified as a 5G technology, for massive numbers of low-power internet-connected devices.


I have to admit, the idea of a run-of-the-mill security-hole-ridden IoT device being exposed to the bare internet over a wireless connection that's out of my control gives me the heebie-jeebies... and that's ignoring the possibilities for malicious use (e.g. using a low powered wireless connection for uncontrollable surveillance/data collection).

The operators would argue this is where their "managed service" offering kicks in and gives you value.

Most operators haven't figured out their business model for NB-IoT quite yet (at least in Europe) - they're still dabbling. Some seem likely to try to pair it with enterprise "private APN" type solutions. Under such a setup, you can actually get quite an interesting system in place - the operator locks the SIM to a custom APN, and that APN only allows comms to a managed, operator-provided IoT backend.

Then the operator's enterprise services team turns that into dashboards and other things the customer can use and access. In a sense, they're using "extra slack capacity" on their FDD cellular networks (as an NB-IoT carrier is only 200 kHz wide and can sit inside or adjacent to a regular 4G or 5G carrier), and delivering higher margin managed enterprise services.

Some other comments point out the potential to use LoRa - indeed, although if you can use LoRa, you probably aren't the target market for NB-IoT. If you want to deploy 50 million smart meters, a nationwide existing network and managed service from the operator starts to get appealing, as does them handling security, isolating the devices onto a private APN, and helping you update devices in the field.

If you are using LoRa, you need to handle this and deploy the infrastructure. To date though, I've seen lots of "unmanaged" NB-IoT testing taking place, but not a whole lot of the "full service managed offering".

Otherwise I would agree entirely with your point about connecting modern IoT devices to the internet, but in this case I think it will end up being used for enterprise-type deployments where they're restricting that for you.

I am really surprised this isn't already more of a 'thing'.

When I did my postgrad research project, back in 2016, I was using LoRaWAN and thought it was so obviously going to be huge in e.g. AgriTech. Surprised not that much has happened with it tbh.


Cheap LoRa chips have finally been hitting the market for long enough that ecosystems can grow around them. As an example, for radio-control vehicles (planes, drones, cars) there have been 3 different systems released over the past year or so, and now there's an open-source system called ExpressLRS which is gaining traction.

I've implemented some use-cases for T-Mobile; they basically peddle our products as a white-label solution to municipalities. One of my biggest beefs with NB-IoT is that 90% of all the use-cases I designed would have been better off using LoRaWAN instead. The reason NB-IoT is chosen is obviously that it provides the operator recurring revenue via a data SIM, but apart from cost there are other massive drawbacks, such as the energy draw of an NB-IoT-enabled sensor compared to LoRa.

NB-IoT is justified if I know that the data volume of my "solution" might increase due to feature/scope creep (and replacing the battery/sensor isn't going to become an annoyance in 2-3 years at end of life).

For most use-cases LoRaWAN makes more sense, but it doesn't have the same marketing budget that is available to T-Mobile, Vodafone, and co.


I work adjacent to this space and LoRa is great (really great, the specs look impossible on paper) but everyone has been peddling proprietary cloud-based SaaS solutions that come and go every year, for about a decade now. For non-cloud/SaaS solutions, nothing seemed to be able to compete against the branding of wifi and bluetooth. I think this is the main reason it didn't take off. 802.11ah looked like it was going to finally become a good option, with wifi branding, but it somehow was never really released into the market.

I think Bitcoin is genuinely exciting. I know it's not exactly "under-the-radar", but a lot of the discussion on HN is unfortunately only related to the price. It's a bit of a shame, but I can understand it's hard to see through the noise sometimes.

The fact that there isn't more discussion around the cryptography, networking, etc. suggests to me that many are still unfamiliar with the power of the underlying technology.