1.14.4 Little Question (Best way for removing dropped items)

Discussion in 'Spigot Plugin Development' started by JoeyPlayzTV, Feb 4, 2020.

  1. Hey, I want to improve my server system. I have a system that clears dropped items.

    Currently I have this:

    Code (Java):
    for (Entity entity : world.getEntities()) {
        if (entity instanceof Item) entity.remove();
    }

    But there is another way (more advanced):
    Code (Java):
    for (Item item : world.getEntitiesByClass(Item.class)) {
        item.remove();
    }
    Now my question: which is the better way? I think method 2, because it gets only the entities I want to remove instead of looping through all entities in the world and checking their type.

    Thanks for answers.
  2. Define "better".
    Performance wise they shouldn't differ much.
    getEntitiesByClass(...) will still loop through all entities and check each entity's class
    with isAssignableFrom(...). So both are fine.
    If you want to introduce conditions for the removal of items (with a Predicate<Item> for example) then the second one is
    more convenient as you don't have to cast the entities while iterating.
    • Agree Agree x 2
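To make the "no casting while iterating" point concrete, here is a plain-Java sketch. The `Entity`/`Item` classes below are made-up toy stand-ins for the Bukkit types, just to show the difference; the predicate and thresholds are invented for illustration:

```java
import java.util.List;
import java.util.function.Predicate;

public class TypedIterationDemo {
    // Toy stand-ins for Bukkit's Entity/Item, purely for illustration.
    static class Entity { }
    static class Item extends Entity {
        final int age;
        Item(int age) { this.age = age; }
    }

    // Untyped variant: every use of the predicate needs an explicit cast.
    static int countRemovableUntyped(List<Entity> world, Predicate<Item> shouldRemove) {
        int count = 0;
        for (Entity entity : world) {
            if (entity instanceof Item && shouldRemove.test((Item) entity)) count++;
        }
        return count;
    }

    // Typed variant: iterating only Items, the predicate applies directly.
    static int countRemovableTyped(List<Item> items, Predicate<Item> shouldRemove) {
        int count = 0;
        for (Item item : items) {
            if (shouldRemove.test(item)) count++;
        }
        return count;
    }

    public static void main(String[] args) {
        Predicate<Item> old = item -> item.age > 300;
        List<Entity> world = List.of(new Item(10), new Entity(), new Item(600));
        List<Item> items = List.of(new Item(10), new Item(600));
        System.out.println(countRemovableUntyped(world, old));
        System.out.println(countRemovableTyped(items, old));
    }
}
```

Both variants do the same work; the typed one just keeps the cast out of the loop body.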
  3. 7smile7 is correct. Nothing more needs to be said on the matter. :)
  4. The only way to make it better, is to turn it into one excruciating line.

    Code (Text):
    Bukkit.getWorlds().forEach(world -> world.getEntities().stream().filter(entity -> entity instanceof Item).forEach(Entity::remove));
    There, now anyone reading your code needs to have experience with streams and lambdas to understand what's going on. Mission accomplished :)

    Your code is fine, there are plenty of different ways to do things. Don't worry about optimization all the time. Finish your task at hand, and if performance is an issue then resolve it.
    • Optimistic Optimistic x 1
  5. This is actually significantly less efficient than having the for loop. It may look fancy but it will run worse.
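For anyone who wants to check the "streams run worse" claim themselves, here is a rough self-contained timing sketch with plain Java and toy data instead of Bukkit entities. Single-shot timings like this are noisy (JIT warmup dominates); a serious comparison would use a harness like JMH:

```java
import java.util.ArrayList;
import java.util.List;

public class LoopVsStream {
    // Classic enhanced-for loop over the data.
    static long countLoop(List<Integer> data) {
        long count = 0;
        for (Integer n : data) {
            if (n % 2 == 0) count++;
        }
        return count;
    }

    // Same work expressed as a sequential stream pipeline.
    static long countStream(List<Integer> data) {
        return data.stream().filter(n -> n % 2 == 0).count();
    }

    public static void main(String[] args) {
        List<Integer> data = new ArrayList<>();
        for (int i = 0; i < 1_000_000; i++) data.add(i);

        long t0 = System.nanoTime();
        long a = countLoop(data);
        long t1 = System.nanoTime();
        long b = countStream(data);
        long t2 = System.nanoTime();

        System.out.printf("loop:   %d evens in %d us%n", a, (t1 - t0) / 1_000);
        System.out.printf("stream: %d evens in %d us%n", b, (t2 - t1) / 1_000);
    }
}
```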
  6. You can replace the instanceof check in the filter with a method reference.

    Code (Text):
    .filter(Item.class::isInstance)

    Yes, I think it is around 5 times less efficient with a small number of entries.

  7. Yeah, I am already working with predicates and lambda expressions. But I simplified the code for the forum, so people who don't know about predicates don't get confused.

    Thanks for all your suggestions.
  8. Streams generate a certain overhead. Sequential streams are unlikely to be worthwhile; however, if you have a genuinely parallel-intensive problem, a parallel stream might win out. I didn't benchmark this, but an instanceof check should be capable of running in parallel, so provided you have enough cores available you would win out.
  9. Oof, not so sure about that. Joining parallel streams back together for further serial computing also adds quite a bit of overhead.
    I did a ton of benchmarks in order to understand where exactly the overhead comes from, but parallel streams always came out
    worse unless you have a lot of elements with a lot of chained parallel operations.
    (Could be wrong though. Parallel streams seem so unpredictable...)
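A minimal sketch for comparing sequential and parallel streams yourself, using plain boxed integers instead of entities (the sizes are arbitrary, and one-shot nanoTime measurements are only a rough indication; where the break-even point lies will vary by machine and workload):

```java
import java.util.List;
import java.util.stream.Collectors;
import java.util.stream.IntStream;

public class ParallelBench {
    // Same filter-and-count, either as a sequential or a parallel stream.
    static long countEvens(List<Integer> data, boolean parallel) {
        return (parallel ? data.parallelStream() : data.stream())
                .filter(n -> n % 2 == 0)
                .count();
    }

    public static void main(String[] args) {
        for (int size : new int[] {1_000, 10_000_000}) {
            List<Integer> data = IntStream.range(0, size).boxed().collect(Collectors.toList());
            for (boolean parallel : new boolean[] {false, true}) {
                long start = System.nanoTime();
                long count = countEvens(data, parallel);
                long micros = (System.nanoTime() - start) / 1_000;
                System.out.printf("size=%d parallel=%b count=%d time=%d us%n",
                        size, parallel, count, micros);
            }
        }
    }
}
```

Both variants must of course produce identical counts; only the timing differs.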
  10. Do you have information about the break even point in your example? Like from how many entries the stream is faster? 10K, 100K, 1M, 10M?
  11. I was making a joke, but I guess I wasn't obvious enough with it.
  12. Don't make 'jokes' on a help forum and make it look like an answer...
  13. A stream is slower on a small number of values and faster on a mass of data. You can check it out with the following code:

    Code (Java):
    long start = System.currentTimeMillis();
    yourList.removeIf(entry -> entry.startsWith("tom"));
    long elapsed = System.currentTimeMillis() - start;

    The method above is fast on 10000 Toms and extremely slow on 10 Toms.

    Put 10000x "tom" in the list, run the code, and then try it out with only 10 Toms.
    • Agree Agree x 1
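The experiment described above, wrapped into a complete runnable harness. Note that a single timed run mostly measures JIT warmup, so don't read too much into the absolute numbers:

```java
import java.util.ArrayList;
import java.util.Collections;
import java.util.List;

public class RemoveIfTiming {
    // Removes every "tom..." entry and returns how many elements are left.
    static int removeToms(List<String> list) {
        list.removeIf(entry -> entry.startsWith("tom"));
        return list.size();
    }

    public static void main(String[] args) {
        for (int toms : new int[] {10, 10_000}) {
            List<String> list = new ArrayList<>(Collections.nCopies(toms, "tom"));
            long start = System.nanoTime();
            int remaining = removeToms(list);
            long micros = (System.nanoTime() - start) / 1_000;
            System.out.printf("%d toms removed in %d us (remaining: %d)%n",
                    toms, micros, remaining);
        }
    }
}
```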
  14. If you have a lot of entities you might be better off removing entities using a predicate filter and returning false every time to prevent the method from filling a temporary list. Haven’t measured this though.
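The trick in the post above can be sketched in plain Java. The `collectMatching` helper here is a made-up stand-in for a predicate-accepting lookup (newer Spigot API versions have ones like `World#getNearbyEntities` with a `Predicate` parameter), and `entity.remove()` is simulated with a list; whether this actually saves anything in practice is, as the poster says, unmeasured:

```java
import java.util.ArrayList;
import java.util.List;
import java.util.function.Predicate;

public class PredicateRemovalDemo {
    // Made-up stand-in for an API lookup that collects every element
    // matching the filter into a temporary result list.
    static <T> List<T> collectMatching(List<T> source, Predicate<T> filter) {
        List<T> result = new ArrayList<>();
        for (T element : source) {
            if (filter.test(element)) result.add(element);
        }
        return result;
    }

    public static void main(String[] args) {
        List<String> entities = new ArrayList<>(List.of("item", "player", "item"));
        List<String> removed = new ArrayList<>();

        // Do the removal as a side effect inside the predicate and return
        // false, so the lookup never accumulates a temporary result list.
        List<String> result = collectMatching(entities, e -> {
            if (e.equals("item")) removed.add(e); // stand-in for entity.remove()
            return false;
        });

        System.out.println("result size: " + result.size()); // stays 0
        System.out.println("removed: " + removed.size());
    }
}
```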