The abstracted answer is that it will require further axioms to define moral outcomes and what it means for them to be good or bad. "Measuring" — I'm not sure I'm comfortable with that word; "evaluating" is better. I'd take inspiration from groups, ZFC, and probably quite a few other parts of mathematics, and basically try to reify the previously mentioned aspects into mathematical objects that can be subject to binary operations and mappings. Moral questions would then behave much like sets, and asking moral questions would be done through unions and intersections of sets.
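The idea above can be sketched very roughly in code. This is only a toy illustration under my own assumptions, not anything the poster specified: the axioms, outcome names, and variables here are all invented, and each "axiom" is modeled as nothing more than the set of outcomes it permits, so combining moral questions reduces to plain set union and intersection.

```python
# Hypothetical sketch: each moral "axiom" reified as the set of
# outcomes it permits. All names below are invented for illustration.
permitted_by_consent = {"trade", "speech", "assembly"}
permitted_by_no_harm = {"speech", "assembly", "charity"}

# "What does at least one axiom permit?" becomes a union...
either = permitted_by_consent | permitted_by_no_harm

# ...and "what do both axioms permit?" becomes an intersection.
both = permitted_by_consent & permitted_by_no_harm

print(sorted(either))  # outcomes allowed under at least one axiom
print(sorted(both))    # outcomes allowed under both axioms
```

Of course, the hard part the post acknowledges — choosing the axioms and defining "good" and "bad" — happens before any of this machinery can be applied; the set operations only combine judgments that are already encoded.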
I don't think I'd actually be able to do that, though. Again, I'm not worshipping my own intellect. The best I realistically could do is to create some abstracted lists of axioms that are engineered to justify certain aspects of modern human society that I find good (democracy, constitutional rights, equality) and make some basic observations about what they imply and what they eliminate.
Very good Mr Brothir, what a delicious specimen you are.
Now, for you to put in the sequence of operations and monitor its outcomes requires the input of your own moral agency. You yourself would put in your moral framework so that it would do the calculation to benefit all of mankind. What it seems to me Jordan is saying is that you are still applying your own a priori morality for the scientific tool of mathematics to solve as many aggregations as possible for the benefit of humanity. Before you evaluate, you have to see that your experiment treats humanity as an end and not a means. You as a human being cannot physically go through all those outcomes at once, so you would need an AI of sorts to do the job. However, if that AI approaches the likes of a singularity, what outcome could that have? Would your own morality be sufficient to prevent an unseen catastrophe from the AI itself, wherein its own creation becomes tethered to man's existence, etc.?
It is fun to think about.