Isn't that "for some reason" in C# that it's the standard way of doing dependency injection and being able to unit test/mock objects?
I've found it easier to work in C# codebases that just drank the Microsoft Kool-Aid with "Clean architecture" instead of Frankenstein-esque C# projects that decidedly could do it better or didn't care or know better.
Abstraction/design patterns can be abused, but in C#, "too many interfaces" doesn't seem that problematic.
I agree with you on this, my issue is mainly when they bring this thinking with them into other languages. I can easily avoid working with C# (I spent a decade working with it and I’d prefer to never work with it again), but it’s just such a pain in the ass to onboard developers coming from that world.
It may be the same for Java as GP mentioned it along with C#, but they tend to stay within their own little domain in my part of the world. By contrast C# is mostly used by mid-sized stagnant to failing companies, which means C# developers job hop a lot. There are also a lot of them, because mid-sized companies that end up failing love the shit out of C# for some reason, and there are soooo many of those around here. Basically, we have to un-learn almost everything a new hire knows about development if they've solely worked with C#.
> I've found it easier to work in C# codebases that just drank the Microsoft Kool-Aid with "Clean architecture" instead of Frankenstein-esque C# projects that decidedly could do it better or didn't care or know better.
I agree, for the most part. There's a bit of a balance to strike: if you drink the Kool-Aid for top-level stuff, but resist the urge to enter interface inception all the way down, you can land in a decent spot.
e.g. on modern .NET Core, literally nothing is stopping you from registering factory functions for concrete types, without an interface, using the out-of-the-box dependency injection setup. You keep the most important part, inversion of control: `services.AddTransient<MyConcreteClass>(provider => new MyConcreteClass(blah, blah, blah));`
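For what it's worth, here's a minimal sketch of what that looks like end to end with Microsoft.Extensions.DependencyInjection. The type names (`SmtpClientWrapper`, `EmailSender`) and the host string are made up for illustration:

```csharp
using Microsoft.Extensions.DependencyInjection;

// Hypothetical concrete types -- no interfaces anywhere.
public class SmtpClientWrapper { }

public class EmailSender
{
    public EmailSender(SmtpClientWrapper smtp, string host) { /* ... */ }
}

public static class Program
{
    public static void Main()
    {
        var services = new ServiceCollection();

        // Simple case: the container constructs the concrete type itself.
        services.AddTransient<SmtpClientWrapper>();

        // Factory case: you control construction, but consumers still just
        // take EmailSender as a constructor parameter -- that's the
        // inversion of control you actually care about.
        services.AddTransient<EmailSender>(provider =>
            new EmailSender(
                provider.GetRequiredService<SmtpClientWrapper>(),
                "smtp.example.com"));

        var sender = services.BuildServiceProvider()
                             .GetRequiredService<EmailSender>();
    }
}
```

If you later decide a seam is genuinely needed for testing, you can introduce the interface at that point, rather than up front for every class.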
While I do occasionally encounter people rambling on about nothingburgers in meetings, I get more irked that the status quo at my workplace is that people just sit there silently, contributing nothing in medium-sized (6-12 people) meetings. It just seems like nobody wants to risk saying something that might be challenged or "seem dumb", as if they are all suffering from imposter syndrome.
Recently a fellow developer was doing a demo of an automated UI testing suite and how it could apply to our product, and when it came time for questions or to show any sort of interest at all, it's just crickets. I feel obligated to participate in situations like these, reach for questions, or at least acknowledge others' hard work, because nobody else seems to want to. For me it's frustrating. I wish I were surrounded by people more willing to give their 2 cents, even if it means a slightly longer meeting, rather than staring at a sea of blank faces that don't bother to queue their mic for the entire meeting.
6-12 people isn't a meeting, it's a presentation. At that size not everyone can contribute and most people probably don't need to be there. Meetings of that size are generally either "update the boss" meetings, where everyone goes one by one and says what they have been doing (a terrible waste of everybody's time EXCEPT the boss's, and so sometimes justifiable), or presentations from one person to the group. If you find there is no interaction or feedback from the rest of the people in your presentation, it is probably either a bad presentation or you are presenting to people who don't want or need to be there.
I really think this is a backwards way to look at it. Work typically happens in an organization, and in organizations you don't always do the most optimal thing for yourself as an individual. I know everyone on HN would love it if they just got a steady stream of tickets into their inbox, never had a meeting about anything ever, and only interacted with git and HN. But that's not how the world works. Being attentive and engaged for 30 minutes while you get information that may very well make your job easier is not a big ask.
I love love love working from home full-time but this is my chief complaint about it - before COVID (at least in the smaller places I worked), if you brought your laptop into a meeting, spent the entire time typing, and didn't engage anyone, you'd probably get either a warning or it would be your last meeting. It forced people to actually pay attention and not just pretend like they were, or at the very least risk getting called out for it.
And while 12 is certainly pushing it, I've definitely been in productive working meetings with 6-8 people where all have been contributing.
Having worked from home for 20 years now, before COVID people were more careful to work within the medium (at least in the places I worked). Someone blissfully playing on their laptop ignoring what's going on around them was never an issue as you would never be put in that situation to begin with.
Since 2020, it seems everyone wants to carbon copy the office at home. I expect that's because they haven't (yet) developed WFH skills, which is understandable, especially in organizations that were abruptly thrust into the home without experienced leadership to guide those unfamiliar with the environment along.
An alternative is to target your presentation at the people who will be in the room and prepare a bunch of relevant questions/anecdotes, or start a discussion with a couple of people you know will participate. As in—know your audience. This usually relaxes everyone and kicks off the interaction.
My experience is that nobody wants to ask the first question because it sets the bar. But that doesn’t mean nobody wants to participate.
Like, "Bob? What do you think of this? Would you ever use it?"
It's actually kind of obnoxious to call on people like that. Even if it may seem like a "leadership thing" -- which may explain why it seems such a favored technique among wannabe alpha manager types.
I'm with the parent commenter: if people are in the flow (and give a shit), they'll definitely have something to say (and your difficulty will be in getting them to keep it short). If they're not, and you're getting crickets -- it points to a deeper problem. That cannot be solved by, in effect, throwing chalk at people to get them to speak up.
My assumption is that if you're in the meeting, it has some adjacency to your work. This isn't about calling on the daydreaming kid in high school Spanish class who _has_ to be there to graduate. If you're in the meeting, it should be applicable to you and you should be ready to give some input; even saying something like "I'm not sure, need more info", or "don't have anything to add" is a valid and acceptable answer.
If the meeting isn't germane to your work, why are you in it?
I know people who do this because they're genuinely trying to get input from a broad set of people, some of whom will never speak unless they're asked directly. It doesn't have to be a mark of an alpha trying to beta everyone else.
>My assumption is that if you're in the meeting, it has some adjacency to your work.
This assumption does not match up with my experience.
>If the meeting isn't germane to your work, why are you in it?
I personally have gotten pretty good about declining meetings, but plenty of people aren't. Besides, I've been occasionally asked by my direct manager to attend a meeting that it turns out I wasn't actually needed at or remotely interested in for my work.
It would be nice to live in a world where meetings worked ideally and there weren't a bunch of people there wasting their own time, but that is sadly not the world we live in.
If it's not a meeting that you have any applicability to and someone asks for your input in the meeting, say so.
"Sorry, I don't see myself using this product/service/team - not because it isn't good, it just isn't relevant to what I'm responsible for/in charge of".
I feel like people treat meetings like this inescapable prison; once you're invited, you can never escape! It's bonkers. If you don't need to go to the meeting or don't have applicability… don't.
I work for a Fortune-listed company -- exactly the kind of place where attendee bloat thrives -- and I've never once had any manager or supervisor aggressively push back on my declines if they are valid.
But throwing chalk works. Don't get me wrong, I wish all the people whose opinions I value, or whose buy-in I need, were full of confidence and perfectly fine speaking in public. It would make my life easier. Sadly they are not, so I sometimes have to push them into the swimming pool. Hopefully at some point they will realise they are perfectly able to swim. In the meantime, well, tough love it's gonna be.
I think throwing chalk could just be gaming metrics. If people are interested, it shows that what's happening is valuable: they benefit from what's being presented, or have input the presenter needs and want to get it across. What the organization should care about is not that people ask questions or give feedback, it's that time is being used well. That is, we know a meeting isn't a waste of time because people are interested, active participants; if people just sit quietly and wait for the thing to be over, maybe it wasn't that useful.
When you force people to talk you are getting the metric (people asked questions), but because you are forcing it, the metric becomes disconnected from what you actually care about: was this meeting a waste of time? You haven't actually improved things, you've just obfuscated the problem.
I have worked with people who will just sit quietly and wait for things to be over, even if they have questions, just to avoid having to talk in a meeting -- and might ask you afterwards if you're lucky.
It "works", but with negative side effects whose impact you seem to be underestimating. Meanwhile, it's quite easy to encourage people to speak up more (and to take risks while doing so) via other, infinitely more decent and respectful means.
It's a "leadership thing" because most good leaders have learned that it's useful to invite others to share their opinion before you do so yourself. Especially in more authority-style cultures, because you'll get better input that way. Then people aren't trying to just agree with whatever the person in a leadership role said.
I think there is a nuance here, where this is only useful if you do the work to tailor it to that person. ie "Bob, I know your team has had [X] concern in the past. From the [Y] perspective, does this seem useful to you?"
The meta point being I don't think there are simple tricks or shortcuts to better participation.
Then poor Bob has to fumble to find the mute button and confess he hasn't paid attention to what you have been droning on about for the last 2 minutes and asks you to repeat the question.
If you're asking an expert on the topic, or stakeholders, or those who have that topic in their circle of competence, then why would it be considered bullying? You'd expect them to have an opinion or some sort of feedback.
If you're asking someone who has literally nothing to do with that area of your work, then that's just kind of a weird situation. The key is tying the question to the area that is impacted by the person you're asking. It could be an open-ended question and not even super specific.
For engagement, one thing that works for me is telling people before the demo/presentation that you'd like them each to share their "feedback" afterwards. But, replace "feedback" with a very specific thing that makes sense for the situation.
So, let's say the demo is on a specific feature, then you might preface the demo by saying this is the way that you decided to solve the problem, but you want to hear from everyone on how you could have done it differently.
That gives people the cue to be actively engaged by putting them between the problem and the solution. It also gives them fair warning that you're going to call on them afterwards, so it gives them time to think of something that they won't feel embarrassed to say (some people need time to be creative, while others think about alternatives and questions on the fly).
Either way, if people aren't speaking up then they might not be engaged. That doesn't mean they weren't listening, it just means they don't know why they are there. Maybe it really is a waste of their time, or maybe they just think it's a waste of their time because they don't understand the expectations or purpose.
My dad was a successful executive and he gave me several pieces of advice when I joined the workforce. I think #2 fits in this case, "Expectations are to people, what oil is to an engine".
The question, though, is why are you presenting live in the modern age in the first place? Perhaps 100 years ago there wasn't much choice but to form an assembly in a meeting room to present your information, but with all the technology we have today – and especially when you are working from home – that is completely unnecessary and counterproductive.
A meeting can be very valuable, but only when the meeting takes place after the participants have taken in the presentation and are ready to discuss it. After they've understood the material and have gathered any additional context needed to relate the information to the bigger picture.
In my experience, when you do this the people involved will be much, much more engaged and you will see far more productive results from that. The "Let me bombard you with new information without the full context. Any questions?" meetings never go anywhere because the attendees haven't come prepared with the context they need. The people generally want to be engaged, but when something is sprung on them, it is hard to work with. Even when people do speak up, you can hear that they're struggling to relate it all together. Things are very different when it is an "after the fact" meeting.
I suppose it is fun to reenact the past once in a while, but it is strange that we want to do it so frequently when it so clearly falls flat.
> Recently a fellow developer was doing a demo of an automated UI testing suite and how it could apply to our product, and when it came time for questions or to show any sort of interest at all, it's just crickets.
Get back to me in a day or two after I've had time to read the documentation, play with it for a while, and, most importantly, think about it, and I might have some questions or comments. In the moment watching someone else dick around? No chance. Even if I wanted to participate beyond your expectations, my mind will be blank. Guaranteed.
> For me it's frustrating
I too am frustrated by these types of meetings. An email/Slack/whatever message containing "Hey! Check out this UI testing framework. Think it would work for us?" would provide just as much information as the presentation, while allowing more time to actually investigate to a necessary depth and come back with a constructive response.
A follow-up meeting to discuss the merits of the technology after everyone has had a chance to consider it isn't so bad. When these are hosted I find people are much more engaged and interesting discussion comes of it. I have no qualms about being challenged or "seeming dumb" in these meetings.
> I wish I were surrounded by people that are more willing to give their 2 cents
Whereas I wish I were surrounded by more people who were interested in software engineering, not being an actor in amateur live theatre. Not that there is anything wrong with the latter, but time and place. Nobody wants to see your presentation at work. Sorry.
IMHO this effect is much worse in video meetings, partly because of the lag and just the whole experience where social cues are muted.
In a real meeting, someone who has something to add will actually have different facial expressions that can be read by others. It just feels so much easier to gradually cut in without feeling like you might be talking over someone.
Much of the time that I don't have anything to say in a meeting, it's because I don't think fast enough to have anything to contribute on the spot. If someone posted the content of the meeting as a text post in a Slack channel, I'd read it, stew on it, and then probably end up writing 2–6 paragraphs of thoughts about it about five hours later. But you want those 2–6 paragraphs right now? I haven't thought of them yet!
And, IMHO, this is the main reason "async meetings" (e.g. email threads) are an improvement over sync meetings. Why put people on the spot, when you know you'll end up getting only a fraction of their mental capacity out of the deal? "War rooms" are for emergencies, not for creative thinking.
> Why put people on the spot, when you know you'll end up getting only a fraction of their mental capacity out of the deal?
I don't get it either. In a meeting the other day I was asked about something I did six months ago. I responded with something to the effect of "Let me refresh my memory and I'll get back to you", but the boss laid on the pressure "well, why don't we try to figure it out now?" So we spent a lengthy amount of time talking about it and reached a conclusion.
When the meeting was complete I spent a few minutes fully engrossing myself in that work, like I wanted to do originally, and realized that what we concluded was wrong. Following that realization, I got back to them with the correct response... What a waste of time that was. It's not like we are talking about how nice the weather is. Technical discussion requires a lot of information that usually isn't available in the moment.
As a 20-year veteran of working from home, I have worked with teams who embraced async communication in the past, and it is amazing how much more productive it is. Now that everyone and their brother thinks they can work from home, without having built working-from-home skills, it's been interesting to say the least.
> I feel obligated to participate in situations like these, reach for questions or at least acknowledge others' hard work
There must be two (or more) schools of thought on this then. I am the opposite. It's painfully obvious when people are just asking questions for the sake of it. It's completely pointless and wastes everyone's time. Another thing I see is the asker who wants to make a point about something. They make their point and then tack on a question at the end.
Sounds like more of a symptom of bad meetings. Maybe no one cared and only attended because they felt pressured to show up. IME a more common complaint about meetings is along the lines of "why am I here", not "I'm too scared to chime in".
In your situation, someone doing some unsolicited sales pitch of something that I probably don't need doesn't automatically deserve my attention. They aren't entitled to any acknowledgment from me.
Isn't it reasonable to assume the person spending a lot of extra time programming will generally become a better programmer than someone who doesn't? I'm aware that there are many factors unrelated to raw programming skill that affect job performance. But considering the day-to-day duties of most junior/IC roles consist mostly of programming, shouldn't this extra time spent programming make them, at least eventually, better, more efficient developers on the clock?
While I (thankfully) haven't had to write Markdown from my phone, I think this might be worth exploring. I have very similar experiences trying to find common programming symbols in the Google keyboard on my Pixel.