tags:

views: 164

answers: 5

Can you suggest a precise definition for a 'value' within the context of programming without reference to specific encoding techniques or particular languages or architectures?

[Previous question text, for discussion reference: "What is value in programming? How to define this word precisely?"]

+1  A: 

Have you checked the article in wikipedia?

In computer science, a value is a sequence of bits that is interpreted according to some data type. It is possible for the same sequence of bits to have different values, depending on the type used to interpret its meaning. For instance, the value could be an integer or floating point value, or a string.

Macmade
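To make the quoted Wikipedia definition concrete, here is a minimal Python sketch of the "same bits, different value" point (the constant below is simply the bit pattern a float32 uses for pi; `struct` pins down the byte order explicitly, so this is platform-independent):

```python
import struct

# Pack the 32-bit integer 1078530011 into its raw little-endian bytes...
raw = struct.pack("<i", 1078530011)

# ...and reinterpret the very same four bytes as an IEEE-754 float32.
as_float = struct.unpack("<f", raw)[0]

print(as_float)  # roughly 3.14159: the same bits, read under a different type
```

Same sequence of bits, two different values, depending only on the type used to interpret it.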
I've googled but missed it, thanks.
Vag
Wikipedia's explanation is incorrect: "In computer science, a value is a sequence of bits that is interpreted according to some data type." If I built a computer without bits, it would still be manipulating values.
Vag
I know you're just quoting Wikipedia, but "sequence of bits" seems overly-specific about the representation of values, don't you think? I mean, sure, any other representation we choose may be isomorphic with "bits", but I really don't think that's a meaningful thing to encode into the definition.
Gian
Errrrrr... A computer without bits... Maybe you should start by reading the wikipedia article on computers... : S
Macmade
You know, I'm with you on that one. It would be more accurate if the article distinguished more clearly between the information (the value) and the internal representation (the bits.)
mquander
Macmade, I don't follow you. There are many computers without bits: http://en.wikipedia.org/wiki/Ternary_computer
mquander
@Gian: I perfectly agree with you.
Vag
@Macmade: http://en.wikipedia.org/wiki/Analog_computer
Vag
Sure, but even with trits instead of bits, it won't make a great difference imho...
Macmade
@mquander, how many functioning computers are in operation that don't use bits? Ternary computing is interesting but I would have thought that Quantum computing and qubits would have been a more relevant argument against binary computing but there of course, the concept of a 'value' becomes even more nebulous! ;)
Lazarus
I don't think it's relevant -- point being that obviously a fundamental concept like "value" can't be defined in terms of an implementation detail like "bits in registers."
mquander
@Macmade, I'm a theoretical computer scientist. The concept of "bits" is uncomfortably low-level for me when talking about something as all-encompassing and general as "value". I can talk about lots of things that would not be sensible to encode as bits. In some settings, I can talk about an infinite sequence of things as a value, and I wouldn't really want to sit around and wait while you figure out a bit encoding for it.
Gian
@mquander, in computing terms I think it absolutely can be defined in terms of bits in registers. Only if you limit your thinking to high-level languages can it be described as anything else. When you are down at the metal in assembly language, you are only too aware of how a value is represented.
Lazarus
@Lazarus, if you accept that definition, then the ternary computer mquander linked above should be a counter-example to your definition.
Gian
@Gian, please post the Theoretical Computer Science definition for 'a value' as an answer, I suspect that would be the closest to what Vag is looking for.
Lazarus
I believe Turing spoke of "values" in his machine. The vast majority of Turing machines I've come into contact with over the years were sketches on blackboards, or electrical patterns in someone's brain, i.e., containing no bits at all. Though I suppose depending on your point of view, my brain could be an analog computer. Then again, some people have also claimed there's no value in my brain. :-)
Ken
Yes, but what Turing said about Turing machines is almost irrelevant today :)
Gian
@Gian, then you are getting caught up in terms. In binary computer science (possibly the most commonly accepted computer science) we talk in terms of bits; in ternary computer science we'd use trits interchangeably; in quantum computing we'd use qubits. Irrespective of the word used, the sentence remains the same: "A value is represented as a collection of bits/trits/qubits in a register or memory store (whether electronic, mechanical, magnetic, etc.)." Is there a better definition in your field? Please share it, as this is getting silly :)
Lazarus
I disagree that "binary computer science" is computer science at all, although I've certainly never heard of the term before. Most computer science can be done without reference to computers, and the rest is usually done with reference to machines abstracted far above the bit level. I can't think of much "computer science" that really cares how things are represented once implemented.
Gian
@Gian, you'll make a great IT Architect ;)
Lazarus
A: 

Read the Wiki

Wayne Werner
Read the previous answer :) http://stackoverflow.com/questions/3300726/what-is-data-value/3300766#3300766
Vag
+1  A: 

Here, I'll take a shot: A value is a piece of stored information (in the information-theoretical sense) that can be manipulated by the computer.

(I won't say that a value has meaning; a random number in a register may have no meaning, but it's still a value.)

mquander
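A small Python sketch of this definition: a few random bytes carry no assigned meaning, yet the machine can still manipulate them as a value.

```python
import os

# Eight random bytes: stored information with no assigned meaning.
noise = os.urandom(8)

# The machine can still manipulate them as a value: interpret them as an
# integer, shift it, compare it, copy it around.
n = int.from_bytes(noise, "big")
doubled = n << 1

assert doubled == 2 * n  # manipulation works regardless of "meaning"
```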
If a value has no meaning is it still information?
Lazarus
Sure; a random number contains lots of information, but might have no meaning to anyone.
mquander
I'm upvoting this because I think it's the closest we can get, if we interpret "information" in the sense of "an ordered sequence of symbols". If we take that definition, it's isomorphic with the definition I gave based on Turing tape.
Gian
On reflection, I'm uncomfortable with "that can be manipulated by the computer". That refers to first-class values only. From first-class values we can have higher-order values that we can reason about, but which cannot be manipulated by the computer without encoding them into a first-class representation.
Gian
I see that this is a real substantial difference between our definitions. For example, you would say that any function is a value, since you can obviously represent it uniquely as some series of bits. I would prefer to use "value" to mean only simpler, more "atomic" things that really are manipulated directly by the computer. Perhaps your definition is better, but mine matches what is in my head when I hear "value."
mquander
@mquander, I'd think a random number only encodes one piece of information: its value. How you interpret that number is up to you, and that's where the vagueness wells up. The value must have meaning to the computer, mustn't it? If it means nothing to the device (and I'm stretching 'meaning' a little, as the computer is generally accepted as non-sentient), then how can it manipulate it?
Lazarus
@Gian, but the issue is very specifically about value in terms of programming, and I'm assuming (a very bad thing, I know) that it's also in terms of programming a computer.
Lazarus
Would "a piece of information (in the information-theoretical sense) that can be manipulated by the computer" be an improved version of your definition?
Vag
@mquander, there are plenty of programming languages that allow functions to be manipulated as first-class values (e.g. Haskell, ML, Scala, etc.). We also have to consider languages based on other parts of the lambda cube, where types are values and can be computed, or types may depend on values. It seems that the definition based on computation is unnecessarily restrictive.
Gian
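A minimal Python sketch of functions as first-class values (the names `twice`, `inc`, and `handlers` are just illustrative):

```python
# Functions are values here: they can be stored, passed as arguments,
# and returned just like numbers or strings.
def twice(f):
    return lambda x: f(f(x))

inc = lambda x: x + 1
add_two = twice(inc)      # a function value built from another function value

handlers = {"inc": inc, "add2": add_two}  # stored in a data structure
print(handlers["add2"](40))  # 42
```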
It's all crazy in the end, words, letters and numbers are all just symbolic abstractions after all. Damn Heisenberg!
Lazarus
There is no spoon.
Michael Petrotta
@Gian: My intuitive understanding of `value` implies first-class citizenship. Can you give an example of some non-first-class objects that are obviously values?
Vag
Any kind of computation on types seems to fit that definition.
Gian
Also, in the context of something like C, an array seems to be a value in the obvious sense, but cannot be manipulated as a first-class entity.
Gian
Let me take a try of my own: 1. Without reference to some programming language P with semantics S, the notion 'value' is meaningless. 2. If there exists a reduction semantics RS that is equivalent to S, then values are those terms of RS that reduce to themselves. 3. If there exists an "imperative" semantics IS (i.e. a semantics with a single store mentioned in it) that is equivalent to S, then values are exactly the elements storable in the store of IS. 4. If no such semantics is definable as in (2) or (3), then in the context of P the word 'value' is meaningless.
Vag
ISO C: "value := precise meaning of the contents of an object when interpreted as having a specific type". http://www.open-std.org/jtc1/sc22/wg14/www/docs/n1256.pdf
Vag
I'm not sure that I agree that values have to be in normal form with respect to a reduction semantics. What about lazy evaluation? It seems that one could conceivably manipulate values that were still reducible. I agree with point 3. Could you make the definition more complete to cover some kind of denotational semantics? Perhaps values are the atomic ground terms?
Gian
@Gian: "Could you make the definition more complete to cover some kind of denotational semantics?" No. My background in CS is not sufficient for that. It's your turn!
Vag
About lazy evaluation: the reduction relation may be defined as beta reduction of the head only. And that's it.
Vag
Is there a strong need for the denotational semantics case? For every denotational semantics it is possible to define an equivalent operational semantics, so my definition covers it that way.
Vag
No, no strong need. It might just make it a little more compelling. As a matter of interest, why is it necessary to talk about specific semantics? I use the term "interpretation" in my definition, which is well-understood with respect to any semantics.
Gian
@Gian: "Why is it necessary to talk about specific semantics?" There are many situations where interpretations occur but no values are definable, because there is no computation in the context. So basing the definition on just 'interpretations' would be too general. I use semantics merely to pin down the computational context. Also (as in the Haskell case), what is and is not a value can be shown to depend on which computation and which programming language are alluded to in the current context.
Vag
+3  A: 

Based on the ongoing comments about "bits" being an unacceptable definition, I think this one is a little better (although possibly still flawed):

A value is anything representable on a piece of possibly-infinite Turing machine tape.

Edit: I'm refining this some more.

A value is a member of the set of possible interpretations of any possibly-infinite sequence of symbols.

That is equivalent to the earlier definition based on Turing machine tape, but it actually generalises better.

Gian
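To make the "set of possible interpretations" idea concrete, here is a toy Python sketch: one sequence of symbols, several interpretations, each of which is a value.

```python
# One sequence of symbols...
symbols = "101"

# ...and several members of its set of possible interpretations,
# each of which is a value:
as_binary = int(symbols, 2)    # 5
as_decimal = int(symbols, 10)  # 101
as_hex = int(symbols, 16)      # 257
as_string = symbols            # the string "101" itself

print(as_binary, as_decimal, as_hex, as_string)
```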
You might as well just say a value is anything that can be mapped to an integer. That's the same statement, and clearer. (edited slightly: "integer" instead of "number")
mquander
But then the definition of "value" becomes contingent on the definition of "integer", and I don't feel like opening that can of worms. However it seems like that could be used to formulate an acceptable alternative-but-equivalent definition.
Gian
Why an integer? Surely a Turing-complete machine could represent other numeric forms. After all, an integer is just a convenient representation of a floating-point number with an exponent of 0.
Lazarus
I think the take-home point is that any possibly-infinite sequence of symbols is sufficient. Once you have that, they all collapse into the same definition.
Gian
Yeah, but any other numeric forms it can represent can be mapped one-to-one to integers. For example, it can't represent an irrational number directly, unless you can define that irrational number as a function of something else. (But it can represent rational numbers, since they map to integers.)
mquander
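This mapping can be sketched with the standard Cantor pairing function; the helper names below (`cantor_pair`, `encode_rational`) are just illustrative:

```python
from fractions import Fraction

def cantor_pair(a, b):
    # Standard Cantor pairing: a bijection from pairs of naturals to naturals.
    return (a + b) * (a + b + 1) // 2 + b

def encode_rational(q):
    # Map a non-negative rational (reduced to lowest terms by Fraction)
    # to a single natural number via its numerator/denominator pair.
    q = Fraction(q)
    return cantor_pair(q.numerator, q.denominator)

print(encode_rational(Fraction(3, 4)))  # a unique natural for 3/4 (here 32)
```

Distinct rationals get distinct naturals, so everything the machine can represent this way is already "mapped to an integer."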
Why "possibly infinite", not just "piece of tape"?
Vag
@mquander, but neither can humans. We use symbols to represent irrational numbers just as you would in a program.
Lazarus
Just to be very specific that we are not constraining ourselves to finite things. Infinite things can be values, even though we can't easily compute using them.
Gian
@Gian, your revised definition... couldn't that equally apply to any symbol in the computer program, not just a value?
Lazarus
Yes, it could, which as you correctly pointed out earlier, is actually the case in most real implementations. Code and data are manipulated and represented in the same way at the lowest level, we just use higher-level abstractions to separate them. We can move a pointer to a random piece of memory and dereference it. The result is a value, whether it is pointing to code or data. (small edit, added last sentence).
Gian
Fair enough :) This has been fun!
Lazarus
"A value is a member of the set of possible interpretations of any possibly-infinite sequence of symbols." This is too general. And are there any objects left at all that are not values?
Vag
Sure. A cow is not a value, but if you line up a field full of cows and read them as unary notation, then that is a value represented by cows. Seems reasonable enough to me.
Gian
Take, for example, Haskell. In Haskell, types are not values. It is possible to perform computations on them, but those computations are programmed not in Haskell itself but in a language embedded in Haskell's type system. From that language's point of view, Haskell's types are values. Its computations are performed at compile time, while Haskell's are performed at runtime. So it is incorrect to talk about values without mentioning some programming language semantics in the context. I've tried my own definition in the comments to mquander's answer.
Vag
Finally I've understood your answer. There is a great deal of sanity behind it. But I guess the correct definition lies somewhere between ours.
Vag
I'm glad there is some method in my madness :p Also, I think you have persuaded me re: the first-class nature of values.
Gian
@Gian: Consider that one computer program has written another program using a radio-based random number generator, and that generated program terminates and computes some value. No interpretations were made, but there is a value. Just an uninterpreted program that computes uninterpreted data.
Vag
I was using "interpretation" in the strict semantic sense, which could best be described as "universe of possible meanings". It's not the meaning as understood by a human, necessarily, rather the infinite set of all things that sequence _could_ mean. I argue that a value is any single element drawn from this set.
Gian
@Gian: Then *any* notion A *is* a value, because it can be an interpretation of some sequence of symbols.
Vag
Yes, that does seem to be the effect of it. Can you think of an obvious counter-example where this would not make sense?
Gian
@Gian: If anything is a value, then the meaning of the word 'value' is Leibniz-equal to the meaning of the word 'notion', but these notions are distinct in my vocabulary. All this in the context of programming, of course. For example, a computer itself is not a value unless we are talking about some programming language that has a native counterpart type for computers.
Vag
@Gian: Can I ask: is something a value when no programming language is mentioned at all? I guess not.
Vag
Not quite, because it has to be derived from a possible interpretation of a sequence of symbols. It seems like "notion" doesn't have this requirement. It can capture uncertainty or non-determinism. It seems like our restriction that values be representable by sequences of symbols excludes direct representation of these things, rather we would be forced to quantify it in some way, or encode these concepts using values that do match our definition.
Gian
Yes, I think something can be a value by my definition without reference to a programming language. You probably need to talk about _some_ kind of formal or semi-formal definition though, otherwise the set of possible interpretations becomes something very close to the powerset of everything in the universe.
Gian
@Gian: "It can capture uncertainty or non-determinism". Okay, I was wrong about 'value' = 'notion'.
Vag
@Gian: It seems to me that we have captured two slightly distinct notions: 'value in a programming language' and 'potential value'. But in the context of programming, rather than logic, mathematics, or formal systems, does it seem to you that my notion is more natural?
Vag
@Gian: Let's define a logic whose formal language consists of the single phrase `Abc`, with the single inference rule `Abc |- Abc` and the single axiom `|- Abc`. Are the possible interpretations of this logic subject to your definition? If so, then any statement is a value, because the singleton sequence of symbols `Abc` may be interpreted as absolutely any statement.
Vag
Well, it's a sequence of symbols, so we could choose to interpret it as a logical formula (which is a value), or we could treat it as an encoding of a base-N number (which is a value), or as a string (which is a value), or in any number of other ways.
Gian
@Gian: Okay. But I still do not understand how I can state "In C, the 'for' statement is not a value" using your 'potential value'. It is too general and independent; according to it, the 'for' statement is always a value.
Vag
@Vag, it's not a value because we exclude it from the set of possible interpretations by its definition. If we decided to just think of it as a string or something else, then it's obviously a value. Any piece of program text may be _interpreted_ as a value, but if we explicitly say it isn't, then that interpretation is not in the set of possible interpretations.
Gian
@Gian: Is the set of all real numbers a value? Are uncomputable natural numbers (e.g. Busy Beaver numbers) values? (In the context of programming, of course, not computer-assisted theorem proving.) They can never be manipulated by a computer but can be encoded.
Vag
@Gian: Can your definition be simplified to "X is a value iff X can be encoded"?
Vag
For the context of programming, I think we have to restrict ourselves to the set of possible _finite_ interpretations. As for the second comment, I think it needs to be a sequence of symbols. Otherwise any matter in the universe is an encoding of itself.
Gian
This was a really good question, by the way, and this has been very interesting. It's unfortunate a lot of people seemed to miss that their intuitive use of the term "value" belies the complexity of the concept it actually represents.
Gian
@Gian: this is an instance of a widespread mistake: using meanings that are not total, but are instead images of a couple of particular cases. One symptom of this is 'definitions' like "X is anything that resembles Y or Z".
Vag
@Gian: math people are less subject to this bug: http://mathoverflow.net/questions/30381/definition-of-function
Vag
@Vag - What's the mathoverflow answer to your definition of value within the context of math?
Lazarus
@Gian: Isn't interpretation the assignment of meaning?
Lazarus
@Lazarus, yes, in the loosest sense. The definition we've constructed is not contingent on assigning meanings to things for them to be values though, rather it just relies on the inclusion in the set of all such possible interpretations.
Gian
@Gian: I'd argue that it's assignment of meaning in the narrowest sense but we're into arguing English language semantics. I see the breadth of your definition but I don't think I can get past the word 'value' and the assignation of value as giving something worth and therefore meaning and/or context. It's probably my age :)
Lazarus
It's true that we are relying upon the ability to assign meaning to make the definition work, but instead of insisting that a sequence of symbols have a meaning, we rather just insist that it may become a value if the set of meanings we could give it is non-empty. I think we're basically agreeing with each other here :)
Gian
@Gian: I think we are! That clarification that 'value' is only applicable if the set of meanings is not empty makes sense to me. It was the inference that there is value with an empty set of meanings that caused me difficulty. Your definition is definitely better as an abstract of value, mine perhaps too rooted in the act of programming so +1 to you ;)
Lazarus
A: 

I just happened to be glancing through Pierce's "Types and Programming Languages" - he slips a reasonably precise definition of "value" in a programming context into the text:

[...] defines a subset of terms, called values, that are possible final results of evaluation

This seems like a rather tidy definition - i.e., we take the set of all possible terms, and the ones that can possibly be left over after all evaluation has taken place are values.

Gian
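Pierce's definition can be sketched in miniature with Python: in the toy language below, terms are either plain ints or `("add", t1, t2)` tuples, and values are exactly the terms that can be the final results of evaluation.

```python
# Terms are nested tuples ("add", t1, t2) or plain ints. Values -- the
# possible final results of evaluation -- are exactly the plain ints.
def is_value(term):
    return isinstance(term, int)

def evaluate(term):
    if is_value(term):
        return term            # a value evaluates to itself
    op, left, right = term
    assert op == "add"
    return evaluate(left) + evaluate(right)

term = ("add", 1, ("add", 2, 3))
print(evaluate(term))  # 6 -- an int, i.e. a value
```

Note that `("add", 1, 2)` is a perfectly good term, but it is not a value: it can never be "left over" after evaluation has finished.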
@Gian: Note that it coincides perfectly with my previously proposed definition. One question remains: what if the semantics of a language is given without a notion of evaluation? It may be an assembly-like language, for example. My definition handles that case well.
Vag
Pre-state/post-state seems a reasonable analog of evaluation in higher-level languages?
Gian