I have a physics question.


Sir Penguin
08-11-2005, 17:56:04
Is there a way to encode a physical system such that it can be reproduced from the encoding later?

For example, if we encode some particle as <id, type, anchor, bearing, distance> where:

- id is a unique ID number
- type is the type of particle (for example, an element number and isotope if we're working on the atomic level, or some other uniquely-identifying property on the sub-atomic level)
- anchor is the ID number of the nearest particle that isn't this particle
- bearing is a unit vector pointing from this particle to the anchor particle
- distance is the distance to the anchor

The big problem that I see is that there are infinitely many possible bearings, but they have to be stored in a vector that is representable in a finite amount of data. Is there some scale of particle for which a slight truncation error in the bearing over a comparatively short vector (say, the distance between two such particles in a gas) is negligible?

Another possible problem is the type. The class of types has to be picked such that there are finitely many types (preferably just a few). Also, the type class picked must be small enough that there are only a few replicable types, but large enough that it's still possible to take a bearing.

This is assuming a static system. If it's not possible to encode velocity and location, is it possible to approximate both in a way that won't cause too much chaos when it's decoded? Is there any way at all to approximate them?
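
For concreteness, a sketch of one record in Python (the exact field names and numeric types are just illustrative, nothing settled):

from dataclasses import dataclass

@dataclass
class ParticleRecord:
    id: int                               # unique ID number
    type: int                             # e.g. element number + isotope at the atomic level
    anchor: int                           # ID of the nearest particle that isn't this one
    bearing: tuple[float, float, float]   # unit vector from this particle to the anchor
    distance: float                       # distance to the anchor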

SP

The Norks
08-11-2005, 18:27:20
could you say that again?

Sir Penguin
08-11-2005, 18:35:13
Yes. Would it make a difference?

SP

Venom
08-11-2005, 18:37:41
No. You can't make a transporter.

Sir Penguin
08-11-2005, 18:39:19
I don't want to; I want to know how many bits are required to store the Earth, and I want to estimate its informational entropy.
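
As a zeroth-order framing (pure sketch, plug in whatever particle count and record size you like):

def total_bits(n_particles, bits_per_record):
    # upper bound: one fixed-size record per particle, no compression
    return n_particles * bits_per_record

The informational entropy can't be more than that, and it'll be less to the extent the records are predictable from each other (a crystal lattice is a lot more redundant than a gas).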

SP

Drekkus
08-11-2005, 21:58:03
Originally posted by Venom
No. You can't make a transporter. :lol:

Beta1
08-11-2005, 22:19:56
It's what I thought he was up to.

LoD
08-11-2005, 22:22:28
Where is the velocity vector?

Sir Penguin
08-11-2005, 22:37:58
In the part of the post that you didn't read.

SP

Immortal Wombat
08-11-2005, 22:44:56
Wouldn't the anchor need to be the ID number of the nearest particle that has a smaller ID number than this particle? Otherwise you'd just get paired particles floating around.

Sir Penguin
08-11-2005, 23:15:07
That's a good thought. It can be dealt with in logic though; it doesn't affect the data. It would have an impact on the entropy.

SP

LoD
08-11-2005, 23:30:44
Originally posted by Sir Penguin
In the part of the post that you didn't read.

Right, sorry. Well, I'm no physicist, but logic tells me that even if you were able to encode such a model (i.e. one that stores information on both velocity and location), there's a question of whether it would be correct.

Also, you weren't serious about encoding the entire Earth, were you :)?

Lurker the Second
08-11-2005, 23:31:25
I gave a seminar on this just a few weeks ago. I'll see if I can dig up my notes.

Sir Penguin
08-11-2005, 23:37:56
Originally posted by LoD
Right, sorry. Well, I'm no physicist, but logic tells me that even if you were able to encode such a model (i.e. one that stores information on both velocity and location), there's a question of whether it would be correct.
That would be in the other part of the post that you didn't read.
Originally posted by LoD
Also, you weren't serious about encoding the entire Earth, were you :)?
Why not?

SP

Colon
09-11-2005, 00:09:39
Originally posted by LoD
Right, sorry. Well, I'm no physicist...

This is about physics? I thought he was posting about coding.

Colon
09-11-2005, 00:10:23
No, wait, this IS about coding.

Colon
09-11-2005, 00:21:30
I feel stupid.

Sir Penguin
09-11-2005, 01:10:51
That's nothing compared to how you look. Boo-yah! I kill me.

SP

Koyaanisqatsi
09-11-2005, 01:56:00
Wouldn't the storage requirements of such a system be impossible? Given that you need significantly more particles to describe the system than are in the system itself, you would need storage with more mass than the Earth...and I don't think compression could get you out of the problem, either.

Sir Penguin
09-11-2005, 02:41:33
I don't care if it's impractical. It's interesting. If you require that it be practical, you could use the same technique to estimate how many data are required to encode a teacup, or a cat. I don't know how you'd use that information (unless you were an Intelligent Design advocate), but at least you'd be able to do it.

SP

LoD
09-11-2005, 09:21:09
Originally posted by Sir Penguin
That would be in the other part of the post that you didn't read.


I read that, and was echoing your doubts.
On the matter - Heisenberg's uncertainty principle deals originally with location and momentum, not location and velocity. So it could be mappable if you store no mass, but you'd have to ask a physicist to make sure.


Originally posted by Sir Penguin
Why not?

Because it would be impossible given the current limits in storage and computational power. But, since you don't care about that...

Originally posted by Colon
This is about physics? I thought he was posting about coding.

It's about both I think :).

Originally posted by Koyaanisqatsi
Wouldn't the storage requirements of such a system be impossible? Given that you need significantly more particles to describe the system than are in the system itself, you would need storage with more mass than the Earth[...]

I beg to differ. SP stores a rather simple particle model, so you could store the data about many, many particles in one physical particle.

Kitsuki
09-11-2005, 09:24:09
This thread almost made me cry.

Funko
09-11-2005, 09:25:04
"Heisenberg's uncertainty principle deals originally with location and momentum, not location and velocity."

In physics terms they are the same. You can't just ignore mass, and momentum is the component of the equation that includes the velocity.


That's why the transporters in Star Trek have "heisenberg compensators".
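
Spelled out, for the nonrelativistic case (standard textbook stuff, nothing exotic):

p = m * v
delta_x * delta_p >= hbar / 2

so once the mass is known, an uncertainty in momentum is an uncertainty in velocity, and vice versa.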

MattHiggs
09-11-2005, 09:33:40
A star trek reference - this thread has surely reached the lower depths of nerdiness?

Funko
09-11-2005, 09:34:13
I was trying my best.

Nills Lagerbaak
09-11-2005, 09:38:38
Originally posted by Funko
"Heisenberg's uncertainty principle deals originally with location and momentum, not location and velocity."

In physics terms they are the same. You can't just ignore mass, and momentum is the component of the equation that includes the velocity.


That's why the transporters in Star Trek have "heisenberg compensators".


Momentum is not directional, is it? Therefore they are not the same.

Mr. Bas
09-11-2005, 09:40:22
Momentum is directional; defining it as mass times velocity is a bit simplistic, though. Massless particles also have nonzero momentum, for example.

Funko
09-11-2005, 09:47:36
Yeah, momentum is definitely directional but energy isn't.

If I remember my school physics problems, conservation of momentum always had cos thetas in them...

Oh, and massless particles have momentum if they are moving, don't they? If they are moving they have energy, and therefore intrinsically also have mass?

Nills Lagerbaak
09-11-2005, 09:58:16
Ah yes, cue (gettit?) the conservation question with the snooker balls.

Funko
09-11-2005, 10:01:55
Exactly.

Funko
09-11-2005, 10:02:40
(the perfectly elastic snooker balls colliding on a frictionless baize...)

Mr. Bas
09-11-2005, 10:03:27
Massless particles are always moving at the speed of light, otherwise they'd have zero energy and zero momentum. Usually, mass is considered to be the rest mass, and in that view they remain massless, but if you want to use relativistic mass then you could define a nonzero mass for such a particle.
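
The relation behind that, for anyone following along (standard special relativity, not something I'm adding):

E^2 = (p*c)^2 + (m*c^2)^2

so with rest mass m = 0 you get p = E/c, which is nonzero whenever the energy is.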

Funko
09-11-2005, 10:04:55
I always use relativistic mass*!






*this is a lie.

MoSe
09-11-2005, 10:37:53
would that make you chubbier or slimmer?

KrazyHorse
09-11-2005, 11:46:03
Shorter in your direction of motion, but heavier.

KrazyHorse
09-11-2005, 11:51:17
Originally posted by Sir Penguin
Is there a way to encode a physical system such that it can be reproduced from the encoding later?

Yes. Of course. Actually doing it for something macroscopic is, of course, ridiculous.

Koyaanisqatsi
09-11-2005, 13:06:01
Originally posted by LoD
I beg to differ. SP stores a rather simple particle model, so you could store the data about many, many particles in one physical particle.
But you would still need to index them all...and also index an 'anchor' particle for each particle, as well as storing the relationship between the two...so even without storing any information beyond a label, you're already up to at least several times the mass. Add in that you would have to store a significant bitstring just to uniquely identify every particle and you're already up to many, many times as much mass in the storage medium as was in the original matter. And I won't even hazard a guess as to the mechanisms for actually storing or accessing the stored data, which presumably add a not insignificant mass on top of the storage itself.

Drekkus
09-11-2005, 13:51:18
Wait a momentum! Is this about that gay mister Zulu?

Lurker the Second
09-11-2005, 14:09:50
Dorks.

Japher
09-11-2005, 14:57:41
hehe

Nills Lagerbaak
09-11-2005, 15:03:39
Originally posted by Koyaanisqatsi
But you would still need to index them all...and also index an 'anchor' particle for each particle, as well as storing the relationship between the two...so even without storing any information beyond a label, you're already up to at least several times the mass. Add in that you would have to store a significant bitstring just to uniquely identify every particle and you're already up to many, many times as much mass in the storage medium as was in the original matter. And I won't even hazard a guess as to the mechanisms for actually storing or accessing the stored data, which presumably add a not insignificant mass on top of the storage itself.

Couldn't you simplify it further though, by using the same anchor particle for all the particles then just storing the x,y,z coordinates of each particle?

LoD
09-11-2005, 16:04:54
Originally posted by Koyaanisqatsi
But you would still need to index them all...and also index an 'anchor' particle for each particle, as well as storing the relationship between the two...so even without storing any information beyond a label, you're already up to at least several times the mass. Add in that you would have to store a significant bitstring just to uniquely identify every particle and you're already up to many, many times as much mass in the storage medium as was in the original matter. And I won't even hazard a guess as to the mechanisms for actually storing or accessing the stored data, which presumably add a not insignificant mass on top of the storage itself.

What has mass got to do with that? The data should be stored in those properties of the particle that are easily readable, possibly non-mutable via reading, and easily mutable via writing. Mass does not fulfill at least one of those conditions. There are other properties which can fulfill those conditions, such as spin states.

Sir Penguin
09-11-2005, 16:44:10
Originally posted by Nills Lagerbaak
Couldn't you simplify it further though, by using the same anchor particle for all the particles then just storing the x,y,z coordinates of each particle?
That was my original model, but the problem is the finite nature of the bearing vector: since we can only store discrete bearings, the error is multiplied the longer the vector gets. It does take fewer bits to represent a distance from an absolute point than to represent an ID plus a distance from a relative point, though, so we could possibly save some space by having sort of super-particles, made up of several particles that are close to a shared absolute point, and storing those absolute points as in my new model. But I think that makes things more complicated than they need to be.
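
Roughly what I mean by the error multiplying (toy numbers; the angular resolution is made up):

def position_error(distance, bearing_error_radians):
    # small-angle approximation: how far off the reconstructed position ends up
    # when the stored bearing is wrong by a small angle
    return distance * bearing_error_radians

theta = 1e-6                          # hypothetical angular resolution of the stored unit vector
print(position_error(1e-10, theta))   # anchor a bond length away: ~1e-16 m of error
print(position_error(1.0, theta))     # anchor a metre away: ~1e-6 m of error, huge at atomic scale

which is why a short vector to a nearby anchor beats a long vector to one absolute origin.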

SP

Sir Penguin
09-11-2005, 16:47:18
Can I just settle the question of whether this can be stored by invoking Quantum Computing, and the fact that if we could read qubits without destroying them, they would provide infinite storage, thus solving all the problems presented here?

OK, now that's done with, let's proceed with calculating the amount of information stored in the Earth.

Originally posted by KrazyHorse
Yes. Of course. Actually doing it for something macroscopic is, of course, ridiculous.
Why?

SP

LoD
09-11-2005, 17:03:34
Originally posted by Sir Penguin
OK, now that's done with, let's proceed with calculating the amount of information stored in the Earth.

I think SP has been reading too much Douglas Adams lately...

Qaj the Fuzzy Love Worm
09-11-2005, 17:52:54
Originally posted by Sir Penguin
OK, now that's done with, let's proceed with calculating the amount of information stored in the Earth.

Using similar handwaving to that with which you demonstrated infinite storage, I have used my powers of deduction to calculate the following answer for you: Lots.

Sir Penguin
09-11-2005, 18:14:37
That wasn't similar at all!

SP

LoD
09-11-2005, 18:40:11
OK, I'll try to better that then:
A negligible amount, if you disregard the impracticability and impossibility of implementation using state-of-the-art means.

Sir Penguin
09-11-2005, 19:31:14
You people have no imagination.

SP

The Bursar
09-11-2005, 20:19:47
You couldn't do it on a subatomic scale. On an atomic scale, I would've thought the Earth is dense enough that you can store finite bearings.

The Mad Monk
09-11-2005, 20:46:56
Only if you leave out magma chambers, the outer core, and depending on the time frame, the entire mantle.

Drekkus
09-11-2005, 21:49:23
I'm dense enough, but my bearings are very indiscrete.

Koyaanisqatsi
09-11-2005, 22:37:46
Originally posted by LoD
What has mass got to do with that? The data should be stored in those properties of the particle that are easily readable, possibly non-mutable via reading, and easily mutable via writing. Mass does not fulfill at least one of those conditions. There are other properties which can fulfill those conditions, such as spin states.
You'd still need a particle.

I think the obvious solution is that the most efficient storage mechanism would be the object itself. It's just reading and writing that's the problem at that point. Particularly if SP wants to invoke the 'insert miracle here' principle of quantum computing...then all of the calculations would be made simultaneously as well and the only problem is reading the source object and writing the copy. Which, as was pointed out, requires a Sufficiently Advanced Technology like Heisenberg compensators.

LoD
09-11-2005, 22:49:41
Originally posted by Koyaanisqatsi
You'd still need a particle.[...]

Yes - but you could describe multiple particles with one.

Koyaanisqatsi
09-11-2005, 23:35:59
Yes, but you still have to store the relationship somewhere, and that can't be compressed because it requires a unique identifier.

Sir Penguin
09-11-2005, 23:42:08
Originally posted by The Bursar
You couldn't do it on a subatomic scale. On an atomic scale, I would've thought the Earth is dense enough that you can store finite bearings.
Do you mean that on the subatomic scale you actually can't do it in real life, or you mathematically can't make a model that would approximate a description of the Earth (and, if the latter, why not?)? I think you're right about the correct choice being the atomic level. You still can't have a finite bearing, of course, but the error is small enough. It also makes the data a lot more uniform, because the space between the nucleus and the electrons doesn't have to be accounted for.

SP

Sir Penguin
09-11-2005, 23:43:27
Originally posted by Koyaanisqatsi
Yes, but you still have to store the relationship somewhere, and that can't be compressed because it requires a unique identifier.
The identifiers are just bits though, which are easily compressed.

SP

Koyaanisqatsi
09-11-2005, 23:45:24
Not down to anything approaching 1:1

Immortal Wombat
10-11-2005, 01:05:25
Originally posted by Sir Penguin
Do you mean that on the subatomic scale you actually can't do it in real life, or you mathematically can't make a model that would approximate a description of the Earth (and, if the latter, why not?)?

I mean: since electrons are non-localisable, they would have neither bearings nor next-closest particles.

Sir Penguin
10-11-2005, 04:23:15
It's a static system though, and since we're not taking any actual measurements, as long as the electron is there, there's a bearing and distance to the nearest particle (it doesn't matter what it is, it just matters that it's representable).

SP

LoD
10-11-2005, 07:06:00
Originally posted by Koyaanisqatsi
Not down to anything approaching 1:1

A good compression algorithm should be far from, not near, the 1:1 input/output size ratio :).
We weren't discussing compression, BTW. I was arguing with you that one could store information on multiple particles in one particle, as long as the stored model is simple. With or without compression.

Koyaanisqatsi
10-11-2005, 07:52:38
1:1 ratio of particles in storage to particles being described. :p

But beyond that...

According to a completely random website with questionable methodology, there are ~1.33x10^50 atoms on Earth. (Sticking to atoms just because I don't want to deal with subatomic stuff.) In order to store a unique identifier for each atom on Earth, you need an address 167 bits long. In order to create an association, you need to store two addresses and the association data (the vector and which one is the source). So without even counting the type, association, or vector data, you need at least 501 bits of uncompressed data per atom in the Earth. These are unique strings, and thus you can't just make references so that multiple atoms are described by a single string. How do you store 500+ bits in a single particle without resorting to quantum magic?
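
Checking the arithmetic (same random-website figure as above; the 501 counts the atom's own ID plus the two addresses in its association):

import math

n_atoms = 1.33e50                              # the website's estimate, not a measured number
address_bits = math.ceil(math.log2(n_atoms))   # 167 bits to name one atom uniquely
per_atom_bits = 3 * address_bits               # own ID plus the two addresses in the association = 501
total_bits = n_atoms * per_atom_bits           # ~6.7e52 bits before any type, bearing or distance data
print(address_bits, per_atom_bits, total_bits)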

In other news, anybody have a cure for insomnia?

Immortal Wombat
10-11-2005, 08:02:54
Keep staying awake until it goes away.

Greg W
10-11-2005, 08:33:46
Originally posted by Koyaanisqatsi
In other news, anybody have a cure for insomnia? Yeah, reading this thre..... :sleep:

LoD
10-11-2005, 09:31:37
Originally posted by Koyaanisqatsi
[...] How do you store 500+ bits in a single particle without resorting to quantum magic?

First of all, SP will probably thank you for doing the math for him :).
Secondly, two ways:
1. As you've already suggested, compression. It is not true that unique IDs always somehow interfere with the process of compression or make compression worse. I can show the (very simple) proof if you want.
2. Actually resort to quantum "magic".

Koyaanisqatsi
10-11-2005, 10:05:14
Originally posted by LoD
1. As you've already suggested, compression. It is not true that unique IDs always somehow interfere with the process of compression or make compression worse. I can show the (very simple) proof if you want.

How? I was under the impression that for any given lossless compression algorithm that is trying to compress every possible bitstring of a given length, there would be at least as much bloat caused by cases where the algorithm increased the length of the string as there is savings produced by cases where the algorithm reduced it.

Admittedly, I got this impression from an instructor that seemed to not know what he was talking about every once in a while, but...
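
The counting version of that impression does seem to hold up, though (a tiny sketch, nothing rigorous):

n = 8
inputs = 2 ** n                                  # 256 distinct strings of length n
shorter_outputs = sum(2 ** k for k in range(n))  # 2^0 + ... + 2^(n-1) = 255 strings shorter than n
print(inputs, shorter_outputs)                   # 256 > 255: some input can't be made shorter

A lossless scheme has to be invertible, and you can't squeeze 256 different inputs into 255 shorter outputs without a collision.
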
Originally posted by LoD
2. Actually resort to quantum "magic".
Oh, well, quantum magic, that's different...

Koyaanisqatsi
10-11-2005, 10:05:38
Originally posted by Greg W
Yeah, reading this thre..... :sleep:
100-0

LoD
10-11-2005, 10:42:21
Originally posted by Koyaanisqatsi
How? I was under the impression that for any given lossless compression algorithm that is trying to compress every possible bitstring of a given length, there would be at least as much bloat caused by cases where the algorithm increased the length of the string as there is savings produced by cases where the algorithm reduced it.

Admittedly, I got this impression from an instructor that seemed to not know what he was talking about every once in a while, but...

This was probably the case this time as well - for example, have you ever seen a zip compression ratio greater than 100% :)?

Here's a proof:
Let us first define a unique identifier - in this case, we define one such that the 1st identified item has an ID of A, the second AA, and so on. In general, let us define a shorthand notation (here I use . as the concatenation operator):
A^k where k is a positive natural number.
A^1 = "A"
A^k = A^(k-1)."A";

Let C(n) be a compression function that takes in a character string n. C(n) is defined on the set of the unique IDs described above as follows:
C("A") = "A"
C(A^k) = "kA"

Now, we have:
1. A lossless compression function,
2. with a compression ratio always better than 1:1 (apart from k <= 2),
3. applied on a unique identifier.

Having said that, you are probably right that we would need space up to 501 bits to store the identifier*. However, "up to" and "always" are two totally different things.

*Because the cardinality of the set of unique values is roughly equal to the cardinality of the set of possible values.
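
In code form, in case that helps (just a transcription of the scheme above):

def make_id(k):
    # the k-th identifier: "A" repeated k times
    return "A" * k

def compress(s):
    # C as defined above: "A" stays "A", A^k becomes str(k) + "A"
    return s if s == "A" else str(len(s)) + "A"

def decompress(s):
    return "A" if s == "A" else "A" * int(s[:-1])

for k in range(1, 20):
    assert decompress(compress(make_id(k))) == make_id(k)   # lossless on this ID set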

Sir Penguin
11-11-2005, 22:07:56
Here's an easier proof, using all 24-bit strings:

http://www.csc.uvic.ca/~nrqm/pic/words.jpg

SP