See Any Way to Optimize This?

Discussion in 'Programming' started by TheGamingGrunts, May 5, 2015.

  1. Hey guys! So, below is a snippet from a project of mine, and I was wondering if you see any way it can be optimized to be faster and more efficient, especially when handling larger sets of data. Thanks :)

    Code (Java):
    public class ThunderboltManager implements Thunderbolt {

        private Map<String, ThunderFile> fileMap = new HashMap<String, ThunderFile>();

        //removed code...

        public ThunderFile load(String name, String path) throws FileLoadException, IOException {
            name = Validator.checkName(name);
            if (fileMap.get(name) == null) {
                File f = new File(path + File.separator + name + ".json");
                if (f.exists()) {
                    ThunderFile tf = null;
                    if (f.length() != 0) {
                        BufferedReader br = new BufferedReader(new FileReader(f));
                        String line;
                        String jsonData = "";
                        while ((line = br.readLine()) != null) {
                            jsonData += line + "\n";
                        }
                        br.close(); // release the file handle once the contents are read
                        tf = new ThunderFile(name, path, jsonData);
                        JSONObject obj = tf.getJSONObject();
                        Iterator<?> i = obj.keySet().iterator();
                        while (i.hasNext()) {
                            String key = (String) i.next();
                            tf.set(key, obj.get(key));
                        }
                    }
                    // empty file: fall back to a fresh ThunderFile
                    tf = (tf != null) ? tf : new ThunderFile(name, path);
                    fileMap.put(name, tf);
                    return tf;
                }
                return this.create(name, path);
            }
            throw new FileLoadException(name);
        }
    }
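    One concrete cost in the snippet above is `jsonData += line + "\n"` inside the read loop: each `+=` copies the whole accumulated string, so reading n lines is quadratic. A `StringBuilder` (or `java.nio.file.Files`) keeps it linear. A minimal sketch, using only standard-library types (the class and method names here are just illustrative):

    ```java
    import java.io.IOException;
    import java.nio.file.Files;
    import java.nio.file.Path;

    public class ReadSketch {
        // Linear-time read: StringBuilder appends in place instead of
        // re-copying the accumulated string on every iteration.
        static String readAll(Path p) throws IOException {
            StringBuilder sb = new StringBuilder();
            for (String line : Files.readAllLines(p)) {
                sb.append(line).append('\n');
            }
            return sb.toString();
        }
    }
    ```

    On Java 11+ the whole loop can collapse to `Files.readString(p)` if preserving the exact newline handling isn't required.
    
    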
  2. Here's a hint for optimizing it: holy mother of cascading scoped statements.
  3. You might want to try using java.nio for its non-blocking features...
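  To sketch what that suggestion could look like: `AsynchronousFileChannel` issues the read without tying up the calling thread, which can help when loading many larger files. This is only an illustrative sketch, not the thread author's code (the class and method names are made up):

    ```java
    import java.nio.ByteBuffer;
    import java.nio.channels.AsynchronousFileChannel;
    import java.nio.charset.StandardCharsets;
    import java.nio.file.Path;
    import java.nio.file.StandardOpenOption;
    import java.util.concurrent.Future;

    public class AsyncReadSketch {
        // Issues the read asynchronously; the caller could do other work
        // before calling Future.get() to collect the result.
        static String readAsync(Path p) throws Exception {
            try (AsynchronousFileChannel ch =
                     AsynchronousFileChannel.open(p, StandardOpenOption.READ)) {
                ByteBuffer buf = ByteBuffer.allocate((int) ch.size());
                Future<Integer> pending = ch.read(buf, 0); // returns immediately
                int n = pending.get();                     // wait for completion
                return new String(buf.array(), 0, n, StandardCharsets.UTF_8);
            }
        }
    }
    ```

  Whether this actually beats a plain buffered read depends on file sizes and how many loads happen concurrently; for a single small JSON file the simpler synchronous read is usually fine.
    
    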