Yeah, forcing physical-world norms into technology will always result in weird shit like this.
For the "real" world, Research would be studying different compounds to see which ones work well as anode or cathode.
Development would be creating the industrial processes required to scale up manufacturing, or the ancillary infrastructure to support the new battery, or designing a new package for this awesome new cell. All the things that take the new thing from the lab to a marketable product.
So if you come up with a new method for welding ("My new filler alloy reduces argon requirements by half!" or "My new pulsing methodology results in 23% stronger welds between dissimilar alloys."), that's Research. Then the Development of that might be "How do we manufacture these new filler rods to the exacting specs required?" or "We need to have the EE people incorporate my pulsing algo in our welders. Right now it's running on an Arduino in the lab; we need it included in next year's Welder XL4000 model."
Actually doing the weld is just doing the job.
So, back to software. What kind of coding is considered R&D and what is considered "just doing the job"? I guess creating new algorithms, or new features that you expect to be in the product for years to come; those would be R&D. Whereas fixing bugs, working on Kubernetes stuff, writing database backup routines, etc. would not be?
I don't know. This is just my impression of the difference. I'm no economist.
> So, back to software. What kind of coding is considered R&D and what is considered "just doing the job"? I guess creating new algorithms, or new features that you expect to be in the product for years to come; those would be R&D. Whereas fixing bugs, working on Kubernetes stuff, writing database backup routines, etc. would not be?
I would still count the latter as R&D. It's akin to having an industrial engineer redesign the manufacturing floor to accommodate the different manufacturing process of the anodes.
As soon as you need to customize something, it becomes R&D (otherwise you would have purchased it). The "just the job" part is invisible because, well, it's the machine that's doing it (applications auto-start, install, send updates).
For the "real" world, Research would be studying different compounds to see which ones work well as anode or cathode.
Development would be creating the industrial processes required to scale up manufacturing, or the ancillary infrastructure to support the new battery, or designing a new package for this awesome new cell. All the things that take the new thing from the lab to a marketable product.
So if you come up with a new method for welding ("My new filler alloy reduces argon requirements by half!" or "My new pulsing methodology results in 23% stronger welds between dissimilar alloys.") That's Research. Then the Development of that might be "How do we manufacture these new filler rods to the exacting specs required?" or "We need to have the EE people incorporate my pulsing algo in our welders. Right now it's running on an Arduino in the lab, we need it included in next year's Welder XL4000 model."
Actually doing the weld is just doing the job.
So, back to software. What kind of coding is considered R&D and what is considered "just doing the job"? I guess creating new algorithms, or new features that you expect to be in the product for years to come; those would be R&D. Whereas fixing bugs, working on Kubernetes stuff, writing database backup routines, etc. would not be?
I don't know. This is just my impression of the difference. I'm no economist.